Introduction

As part of the Algorithms and Statistics 2 lab, a dataset is presented here for in-depth analysis and modelling. The group (Group L) consists of the following members:

Dataset

Description

The data for this project can be found in a Kaggle challenge. It contains anonymized information on suspected corona cases from the Hospital Israelita Albert Einstein in São Paulo, Brazil, where 5644 patients were tested for corona. Data on pre-existing conditions, blood values, age, and more are also provided. The dataset was already preprocessed in the context of Machine Learning Lab 1; the steps performed there are listed below. The remaining columns are the following:

  • target: result of the corona test
  • Patient.age.quantile: age of the patient as a quantile
  • sickness: patient has a pre-existing condition
  • Patient.addmited.to.regular.ward..1.yes..0.no.: patient was admitted to a regular hospital ward
  • Patient.addmited.to.semi.intensive.unit..1.yes..0.no.: patient was admitted to a semi-intensive care unit
  • Patient.addmited.to.intensive.care.unit..1.yes..0.no.: patient was admitted to the intensive care unit
  • Hematocrit: hematocrit concentration
  • Platelets: platelet concentration
  • Mean.platelet.volume: mean platelet volume
  • Lymphocytes: lymphocyte concentration
  • Mean.corpuscular.hemoglobin.concentration..MCHC.: a measure of the concentration of hemoglobin in a given volume of packed red blood cells
  • Leukocytes: leukocyte concentration
  • Basophils: concentration of basophils, a type of white blood cell produced in the bone marrow
  • Mean.corpuscular.hemoglobin..MCH.: average mass of hemoglobin per red blood cell
  • Eosinophils: eosinophil concentration
  • Monocytes: monocyte concentration
  • Red.blood.cell.distribution.width..RDW.: width of the distribution of red blood cell volumes

All continuous variables are centered. Only 8.4% of the patients in the data are corona-positive.
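
This imbalance can be checked directly on the cleaned data (a minimal sketch, using the file `data/clean/data_clean.csv` as it is read in below; the share in the cleaned subset may differ from the 8.4% of the full dataset):

```r
data_clean <- read.csv("data/clean/data_clean.csv")

# share of corona-positive patients in the cleaned data
mean(data_clean$target == 1)
```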

Cleaning

Some preprocessing steps were already carried out in a Jupyter notebook. Most of them were necessary to remove or replace the large number of missing values. The following steps were performed:

  • All columns with more than 90% NAs were removed
  • The patient ID was removed
  • Constant columns were removed
  • Columns with a correlation above 0.85 to another column were removed
  • Rows containing a high number of NAs were removed
  • The column sickness was created. It equals 1 if any of the many (non-corona) tests performed was positive, i.e. if a pre-existing condition exists.
  • The remaining NAs were imputed with KNN. Since many values have only NAs as neighbors, the mean was mostly used as a fallback. As a result, an extremely large number of values lie around 0 (the data are centered).
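
The cleaning itself was done in Python; purely as an illustration, the correlation filter and the KNN imputation could be sketched in R as follows. This is an assumption about how one would reproduce the steps, not the notebook's actual code; `raw_df` is a hypothetical raw data frame:

```r
library(caret)

# hypothetical raw data frame with numeric columns and missing values
raw_df <- data.frame(a = c(1, 2, NA, 4),
                     b = c(1.1, 2.1, 3.0, 4.2),
                     c = c(5, NA, 2, 1))

# drop columns that correlate above 0.85 with another column
cor_mat  <- cor(raw_df, use = "pairwise.complete.obs")
drop_idx <- findCorrelation(cor_mat, cutoff = 0.85)
if (length(drop_idx) > 0) raw_df <- raw_df[, -drop_idx]

# impute the remaining NAs with KNN (note: knnImpute also centers and scales,
# which matches the centered values described above)
pre    <- preProcess(raw_df, method = "knnImpute", k = 2)
raw_df <- predict(pre, raw_df)
```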

Preparation in R

Reading the data

library(kableExtra)
library(knitr)
library(dplyr)
library(ggplot2)
library(caret)
data_clean <- read.csv("data/clean/data_clean.csv")
str(data_clean)
## 'data.frame':    532 obs. of  17 variables:
##  $ Patient.age.quantile                                 : int  17 1 9 11 0 13 14 9 8 17 ...
##  $ target                                               : int  0 0 0 0 0 0 0 0 0 0 ...
##  $ Patient.addmited.to.regular.ward..1.yes..0.no.       : int  0 0 0 0 0 0 0 1 0 0 ...
##  $ Patient.addmited.to.semi.intensive.unit..1.yes..0.no.: int  0 1 0 0 0 0 0 0 0 0 ...
##  $ Patient.addmited.to.intensive.care.unit..1.yes..0.no.: int  0 0 0 0 0 0 0 0 0 0 ...
##  $ sickness                                             : int  1 0 1 1 1 1 0 1 0 1 ...
##  $ Hematocrit                                           : num  0.2365 -1.5717 -0.7477 0.9918 -0.0742 ...
##  $ Platelets                                            : num  -0.5174 1.4297 -0.4295 0.073 -0.0326 ...
##  $ Mean.platelet.volume                                 : num  0.01068 -1.67222 -0.21371 -0.55029 -0.00447 ...
##  $ Lymphocytes                                          : num  0.31837 -0.00574 -1.11451 0.04544 -0.07253 ...
##  $ Mean.corpuscular.hemoglobin.concentration..MCHC.     : num  -0.9508 3.3311 0.5429 -0.4529 -0.0459 ...
##  $ Leukocytes                                           : num  -0.0946 0.3646 -0.8849 -0.2115 0.0432 ...
##  $ Basophils                                            : num  -0.2238 -0.2238 0.0817 -0.8347 -0.0251 ...
##  $ Mean.corpuscular.hemoglobin..MCH.                    : num  -0.2923 0.1782 1.7463 0.335 -0.0515 ...
##  $ Eosinophils                                          : num  1.4822 1.0186 -0.667 -0.7091 0.0103 ...
##  $ Monocytes                                            : num  0.3575 0.0687 1.2768 -0.2202 0.0421 ...
##  $ Red.blood.cell.distribution.width..RDW.              : num  -0.625 -0.979 -1.067 0.171 0.086 ...
data_clean %>% head() %>% kable() %>% kable_styling(font_size = 6)
Patient.age.quantile target Patient.addmited.to.regular.ward..1.yes..0.no. Patient.addmited.to.semi.intensive.unit..1.yes..0.no. Patient.addmited.to.intensive.care.unit..1.yes..0.no. sickness Hematocrit Platelets Mean.platelet.volume Lymphocytes Mean.corpuscular.hemoglobin.concentration..MCHC. Leukocytes Basophils Mean.corpuscular.hemoglobin..MCH. Eosinophils Monocytes Red.blood.cell.distribution.width..RDW.
17 0 0 0 0 1 0.2365154 -0.5174130 0.0106766 0.3183658 -0.9507903 -0.0946103 -0.2237665 -0.2922693 1.4821582 0.3575467 -0.6250727
1 0 0 1 0 0 -1.5716822 1.4296675 -1.6722218 -0.0057380 3.3310707 0.3645505 -0.2237665 0.1781750 1.0186250 0.0686515 -0.9788991
9 0 0 0 0 1 -0.7476931 -0.4294803 -0.2137107 -1.1145138 0.5428824 -0.8849232 0.0816925 1.7463233 -0.6669502 1.2767589 -1.0673550
11 0 0 0 0 1 0.9918382 0.0729920 -0.5502895 0.0454363 -0.4528995 -0.2114877 -0.8346847 0.3349894 -0.7090895 -0.2202439 0.1710353
0 0 0 0 0 1 -0.0742315 -0.0325821 -0.0044684 -0.0725255 -0.0458798 0.0432139 -0.0251347 -0.0514773 0.0102734 0.0421004 0.0859626
13 0 0 0 0 1 1.0147264 -0.1782442 0.7960289 -0.7307069 -0.3533190 -0.0751308 2.5253651 0.5440767 0.2179768 0.0686515 0.1710353
transform_type <- function(df){
  # iterate over the columns
  for (col_oi in colnames(df)){
    
    # transform to factor if the column has fewer than 10 unique values
    if (df[, col_oi] %>% unique() %>% length() < 10){
      df[, col_oi] <- as.factor(df[, col_oi])
      
    }else{
      # otherwise transform to numeric
      df[, col_oi] <- as.numeric(df[, col_oi])
    }
  }
  return(df)
}

data_clean <- transform_type(data_clean)
str(data_clean)
## 'data.frame':    532 obs. of  17 variables:
##  $ Patient.age.quantile                                 : num  17 1 9 11 0 13 14 9 8 17 ...
##  $ target                                               : Factor w/ 2 levels "0","1": 1 1 1 1 1 1 1 1 1 1 ...
##  $ Patient.addmited.to.regular.ward..1.yes..0.no.       : Factor w/ 2 levels "0","1": 1 1 1 1 1 1 1 2 1 1 ...
##  $ Patient.addmited.to.semi.intensive.unit..1.yes..0.no.: Factor w/ 2 levels "0","1": 1 2 1 1 1 1 1 1 1 1 ...
##  $ Patient.addmited.to.intensive.care.unit..1.yes..0.no.: Factor w/ 2 levels "0","1": 1 1 1 1 1 1 1 1 1 1 ...
##  $ sickness                                             : Factor w/ 2 levels "0","1": 2 1 2 2 2 2 1 2 1 2 ...
##  $ Hematocrit                                           : num  0.2365 -1.5717 -0.7477 0.9918 -0.0742 ...
##  $ Platelets                                            : num  -0.5174 1.4297 -0.4295 0.073 -0.0326 ...
##  $ Mean.platelet.volume                                 : num  0.01068 -1.67222 -0.21371 -0.55029 -0.00447 ...
##  $ Lymphocytes                                          : num  0.31837 -0.00574 -1.11451 0.04544 -0.07253 ...
##  $ Mean.corpuscular.hemoglobin.concentration..MCHC.     : num  -0.9508 3.3311 0.5429 -0.4529 -0.0459 ...
##  $ Leukocytes                                           : num  -0.0946 0.3646 -0.8849 -0.2115 0.0432 ...
##  $ Basophils                                            : num  -0.2238 -0.2238 0.0817 -0.8347 -0.0251 ...
##  $ Mean.corpuscular.hemoglobin..MCH.                    : num  -0.2923 0.1782 1.7463 0.335 -0.0515 ...
##  $ Eosinophils                                          : num  1.4822 1.0186 -0.667 -0.7091 0.0103 ...
##  $ Monocytes                                            : num  0.3575 0.0687 1.2768 -0.2202 0.0421 ...
##  $ Red.blood.cell.distribution.width..RDW.              : num  -0.625 -0.979 -1.067 0.171 0.086 ...

Visualization

The plots show quite clearly that for all blood values so little data was available that the KNN imputation replaced the missing entries with the mean (approx. 0).

plot_col <- function(df, col){
  g <- ggplot(data = df, mapping = aes_string(col)) 
  if(is.numeric(df[,col])){
    g <- g + geom_histogram(position = "identity", aes(fill=target))
  }else{
    # factor columns: use a bar chart of the counts
    g <- g + geom_bar(aes(fill=target))
  }
  print(g)
}

for (col in colnames(data_clean)){
  plot_col(data_clean, col)
}
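
The visual impression can be quantified; a minimal sketch (the band of ±0.05 around 0 is an arbitrary choice, not taken from the lab):

```r
# share of observations per blood-value column that lie in a narrow band
# around 0, i.e. that were most likely mean-imputed
blood_cols <- c("Hematocrit", "Platelets", "Leukocytes", "Eosinophils")
sapply(data_clean[, blood_cols], function(x) mean(abs(x) < 0.05))
```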

Train Test Split

set.seed(3456)
train_idx <- createDataPartition(data_clean$target, p = .8, 
                                  list = FALSE, 
                                  times = 1)

data_train <- data_clean[train_idx, ]
data_test <- data_clean[-train_idx, ]

print(dim(data_train))
## [1] 427  17
print(dim(data_test))
## [1] 105  17
# write.csv(x = data_train, file = "data/clean/train.csv", row.names = F)
# write.csv(x = data_test, file = "data/clean/test.csv", row.names = F)

Modelling Feasibility

first_model <-  glm(target ~ sickness + Patient.age.quantile + Hematocrit, data = data_train, family = "binomial")
print(summary(first_model))
## 
## Call:
## glm(formula = target ~ sickness + Patient.age.quantile + Hematocrit, 
##     family = "binomial", data = data_train)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.1433  -0.5064  -0.1623  -0.0767   3.5838  
## 
## Coefficients:
##                      Estimate Std. Error z value Pr(>|z|)    
## (Intercept)          -2.64987    0.48780  -5.432 5.56e-08 ***
## sickness1            -3.37686    0.73459  -4.597 4.29e-06 ***
## Patient.age.quantile  0.11574    0.03481   3.325 0.000886 ***
## Hematocrit            0.43911    0.18603   2.360 0.018252 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 296.05  on 426  degrees of freedom
## Residual deviance: 216.87  on 423  degrees of freedom
## AIC: 224.87
## 
## Number of Fisher Scoring iterations: 7
preds <- predict(object = first_model, newdata = data_test, type = "response")
print(preds[1:10])
##           5           7           9          17          18          32 
## 0.002330532 0.330809774 0.165192929 0.002615753 0.009281317 0.009281317 
##          41          59          64          69 
## 0.005224621 0.206027145 0.021674737 0.009150683

Goals and Expectations

The goal of this project is to design a model that predicts the outcome of the corona test based on the columns listed above. The current model achieves a precision of 0.91 but a recall of only 0.33, which we aim to improve. The large number of mean-imputed values (see the description above) could become a problem. We have set ourselves the goal of developing a model that can reliably distinguish corona-infected from non-infected patients. We are aware that this is a very difficult undertaking. Perhaps we will at least succeed in building a model that performs a rough classification, so that a second test can afterwards provide certainty for the “uncertain” patients.
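
The quoted precision and recall can be recomputed from the logistic model above (a sketch; the 0.5 threshold and the use of `caret::confusionMatrix` with `positive = "1"` are our assumptions, not necessarily how the numbers were originally obtained):

```r
# classify the test-set probabilities with a 0.5 threshold
pred_class <- factor(ifelse(preds > 0.5, "1", "0"),
                     levels = levels(data_test$target))

cm <- confusionMatrix(data = pred_class,
                      reference = data_test$target,
                      positive = "1")
cm$byClass[c("Precision", "Recall")]
```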

Feature engineering

In the code below we turn to feature engineering. We first select the interesting numeric features and transform and combine them such that corona-positive patients attain, on average, a higher value. In this way we generate several candidate features. Before applying forward selection to add two features, we remove two via backward elimination.

###########################
# get data
###########################

data_train <- read.csv("data/clean/train.csv")
data_test <- read.csv("data/clean/test.csv")

transform_type <- function(df){
  # iterate over the columns
  for (col_oi in colnames(df)){
    
    # transform to factor if the column has fewer than 10 unique values
    if (df[, col_oi] %>% unique() %>% length() < 10){
      df[, col_oi] <- as.factor(df[, col_oi])
      
    }else{
      # otherwise transform to numeric
      df[, col_oi] <- as.numeric(df[, col_oi])
    }
  }
  return(df)
}

#####
# upsample
#####

data_train <- transform_type(data_train)
data_test <- transform_type(data_test)

# upSample appends the response as a column named "Class"; exclude the
# original target from x and rename Class back to target afterwards
data_train_up <- upSample(x = data_train %>% select(-target),
                          y = data_train$target)
data_train_up <- data_train_up %>%
  rename(target = Class)



#############
# interesting features
#############

# Look at the interesting features and first map them to the interval [0, 1].
# Since the final variable should have large values for corona patients, flip
# small values to large ones (1 - x) where needed. Finally apply the exponential
# function to guarantee that all values are positive (the transformation above
# could otherwise yield negative values for the test data).

extract_feat <- function(df){
  feat_oi <- list(
    "age" = exp((df$Patient.age.quantile - min(data_train_up$Patient.age.quantile))/(max(data_train_up$Patient.age.quantile) - min(data_train_up$Patient.age.quantile))),
    "plat" = exp(1 - (df$Platelets - min(data_train_up$Platelets))/(max(data_train_up$Platelets) - min(data_train_up$Platelets))),
    "leuk" = exp(1 - (df$Leukocytes - min(data_train_up$Leukocytes))/(max(data_train_up$Leukocytes) - min(data_train_up$Leukocytes))),
    "eos" = exp(1 - (df$Eosinophils - min(data_train_up$Eosinophils))/(max(data_train_up$Eosinophils) - min(data_train_up$Eosinophils)))
  )
  return(feat_oi)
}

feat_train <- extract_feat(data_train_up)
feat_test <- extract_feat(data_test)

# all 3-element combinations of the four candidate features
comb_table <- combn(x = names(feat_train), 3)


# function that multiplies the features and returns vector of the result
create_feature <- function(name_vec, feat_oi){
  res <- 1
  for(name in name_vec){
    res <- res * feat_oi[[name]]
  }
  return(res)
}


# function that returns a list with all the combinations; each element is named
# by joining the feature names with "_" (e.g. age_plat_leuk)
create_namevecs <- function(){
  name_vec_list <- list()
  list_names <- c()
  for (i in 1:ncol(comb_table)){
    list_names <- c(list_names, paste(comb_table[,i], collapse = "_"))
    name_vec_list[[length(name_vec_list) + 1]] <- comb_table[,i]
  }
  # add combination of all four cols 
  list_names <- c(list_names, paste(names(feat_train), collapse = "_"))
  name_vec_list[[length(name_vec_list) + 1]] <- names(feat_train)

  names(name_vec_list) <- list_names
  return(name_vec_list)
}

feat_list <- create_namevecs()

#####
# create the feature data frames for the training and test sets
#####
feat_df_train <- as.data.frame(lapply(feat_list, create_feature, feat_oi = feat_train))
feat_df_test  <- as.data.frame(lapply(feat_list, create_feature, feat_oi = feat_test))

####################
# plot created features
####################

for (col in colnames(feat_df_train)){
  plot_df <- data.frame(target = data_train_up$target, feat = feat_df_train[, col])
  print(ggplot(data = plot_df) + geom_histogram(mapping = aes_string(fill = "target", x = "feat")) + xlab(col))
}

for (col in colnames(feat_df_train)){
  plot_df <- data.frame(target = data_test$target, feat = feat_df_test[, col])
  print(ggplot(data = plot_df) + geom_histogram(mapping = aes_string(fill = "target", x = "feat")) + xlab(col))
}


####################
# feature selection
####################

# to evaluate a data frame we train a simple SVM with parameters similar to
# @Louis' SVM and return the test accuracy. Ideally we would do this on a
# validation set, but the dataset is too small for that.
eval_df <- function(data_train_oi, data_test_oi){
  set.seed(123)
  
  fitControl <- trainControl(## 10-fold CV
    method = "repeatedcv",
    number = 10,
    ## repeated ten times
    repeats = 3)
  
  svm_fit_radial <- train(target ~ ., data = data_train_oi, 
                          method = "svmRadial", 
                          trControl = fitControl)
  
  prediction_radial <- svm_fit_radial %>% predict(data_test_oi)
  return(mean(prediction_radial == data_test_oi$target))
}


###################
# Backward elimination
###################

# initial values
final_df_train <- data_train_up
final_df_test <- data_test
unselected_feat <- colnames(feat_df_train)
acc_thresh <- eval_df(data_train_oi = data_train_up, 
                      data_test_oi = data_test)
selected_feat <- c()
dropped_feat <- c()

# in each iteration, drop the feature whose removal improves the accuracy (if any)
for (iter in c(1,2)) {
  feat_to_drop <- NA
  for (feat_oi in colnames(final_df_train)[colnames(final_df_train) != "target"]) {
    tmp_df_train <- final_df_train[,! colnames(final_df_train) %in% c(feat_oi)]
    tmp_df_test <- final_df_test[,! colnames(final_df_test) %in% c(feat_oi)]

    tmp_acc <- eval_df(data_train_oi = tmp_df_train, 
                       data_test_oi = tmp_df_test)
    print(feat_oi)
    print(tmp_acc)
    if (tmp_acc >= acc_thresh){
      acc_thresh <- tmp_acc
      feat_to_drop <- feat_oi
    }
  }
  
  if(!is.na(feat_to_drop)){
    dropped_feat <- c(dropped_feat, feat_to_drop)
    final_df_test <- final_df_test[,!colnames(final_df_test) %in% dropped_feat]
    final_df_train <- final_df_train[,!colnames(final_df_train) %in% dropped_feat]
  }
}

##########################
# Forward selection
##########################

# in each of the two iterations, select the engineered feature that
# improves the model the most (if any)
for (iter in c(1,2)) {
  feat_to_select <- NA
  for (feat_oi in unselected_feat) {
    tmp_df_train <- final_df_train
    tmp_df_test <- final_df_test
    tmp_df_train[, feat_oi] <- feat_df_train[, feat_oi]
    tmp_df_test[, feat_oi] <- feat_df_test[, feat_oi]
    print(tmp_df_train[, feat_oi][1:5])
    tmp_acc <- eval_df(data_train_oi = tmp_df_train, 
                       data_test_oi = tmp_df_test)
    print(tmp_acc)
    if (tmp_acc >= acc_thresh){
      acc_thresh <- tmp_acc
      feat_to_select <- feat_oi
    }
  }
  if(!is.na(feat_to_select)){
    selected_feat <- c(selected_feat, feat_to_select)
    unselected_feat <- unselected_feat[-which(unselected_feat == feat_to_select)]
    final_df_test[, feat_to_select] <- feat_df_test[, feat_to_select]
    final_df_train[, feat_to_select] <- feat_df_train[, feat_to_select]
  }
}

# write.csv(final_df_test, "data/clean/test_feat_eng.csv", row.names = F)
# write.csv(final_df_train, "data/clean/train_feat_eng.csv", row.names = F)

Unsupervised Method

Finally, we apply an unsupervised learning method, K-means clustering, to get an overview and to check whether the data can be clustered well with K-means or whether this is pointless. We cluster the complete data with K = 2, corresponding to the two outcomes (sick or healthy). We will also perform a principal component analysis and check, based on the first two principal components, whether a meaningful clustering emerges. From this we expect a somewhat better overview of the complexity of the dataset.

# k-means needs numeric input, so convert the factor columns back to numeric
# and drop the target
clust_data <- data_clean %>%
  mutate(across(where(is.factor), as.numeric)) %>%
  select(-target)
kmeans <- kmeans(clust_data , centers=2, nstart = 10)
cluster_sizes <- kmeans$size
cluster_sizes
## [1] 312 220

The clustering yields far more balanced groups than the actual classes in our dataset. This already suggests that K-means clustering is of little use for these data.
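
How little the clusters coincide with the true labels can be checked with a simple cross-tabulation of cluster assignment against target (a minimal sketch using the objects defined above):

```r
# rows: k-means cluster, columns: true corona test result
table(cluster = kmeans$cluster, target = data_clean$target)
```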

clust_pcov <- prcomp(clust_data, scale=T)
clust_pcov
## Standard deviations (1, .., p=16):
##  [1] 1.5406691 1.3356209 1.2616640 1.1186067 1.0553420 1.0478731 1.0304667
##  [8] 0.9876587 0.9507127 0.8656555 0.8164030 0.8055641 0.7411525 0.7217464
## [15] 0.6273475 0.5637475
## 
## Rotation (n x k) = (16 x 16):
##                                                               PC1          PC2
## Patient.age.quantile                                  -0.19413846  0.187449444
## Patient.addmited.to.regular.ward..1.yes..0.no.        -0.11065261  0.200066222
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no.  0.13550734  0.049507378
## Patient.addmited.to.intensive.care.unit..1.yes..0.no.  0.27786824  0.004503093
## sickness                                               0.16927561 -0.034312565
## Hematocrit                                            -0.23517886 -0.217172256
## Platelets                                              0.36640227 -0.165608793
## Mean.platelet.volume                                  -0.23751784  0.198702641
## Lymphocytes                                           -0.20421836  0.133540280
## Mean.corpuscular.hemoglobin.concentration..MCHC.      -0.20591512 -0.519981890
## Leukocytes                                             0.48156234 -0.197934987
## Basophils                                             -0.28860316  0.193215650
## Mean.corpuscular.hemoglobin..MCH.                     -0.28691907 -0.370110384
## Eosinophils                                           -0.06951921  0.087673222
## Monocytes                                             -0.25795570  0.019459641
## Red.blood.cell.distribution.width..RDW.                0.16391377  0.545291771
##                                                               PC3         PC4
## Patient.age.quantile                                  -0.29604962  0.46829930
## Patient.addmited.to.regular.ward..1.yes..0.no.        -0.27695513  0.29334910
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no. -0.21412471 -0.14232917
## Patient.addmited.to.intensive.care.unit..1.yes..0.no. -0.05310285  0.13022932
## sickness                                               0.21545327 -0.46293032
## Hematocrit                                             0.09525855  0.02619313
## Platelets                                              0.33877008  0.35048068
## Mean.platelet.volume                                  -0.19493342 -0.22389565
## Lymphocytes                                            0.43142098 -0.08987101
## Mean.corpuscular.hemoglobin.concentration..MCHC.      -0.10953932  0.03787172
## Leukocytes                                            -0.09743705  0.17946983
## Basophils                                              0.31817921  0.25383040
## Mean.corpuscular.hemoglobin..MCH.                     -0.09667817  0.20079521
## Eosinophils                                            0.50628362  0.27491373
## Monocytes                                              0.02963286 -0.21294148
## Red.blood.cell.distribution.width..RDW.               -0.05761826  0.04558037
##                                                               PC5         PC6
## Patient.age.quantile                                  -0.04702191 -0.26324880
## Patient.addmited.to.regular.ward..1.yes..0.no.         0.09750949  0.47077981
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no.  0.54870025 -0.47578755
## Patient.addmited.to.intensive.care.unit..1.yes..0.no. -0.20589955  0.10253999
## sickness                                              -0.18411684 -0.17999553
## Hematocrit                                            -0.54908860  0.03262305
## Platelets                                              0.07493366 -0.03965261
## Mean.platelet.volume                                  -0.34805248 -0.40708585
## Lymphocytes                                            0.30649224  0.12486890
## Mean.corpuscular.hemoglobin.concentration..MCHC.       0.15162020 -0.01471974
## Leukocytes                                            -0.19942928 -0.13282691
## Basophils                                             -0.08342793 -0.15990533
## Mean.corpuscular.hemoglobin..MCH.                      0.08200533 -0.29893038
## Eosinophils                                            0.02218811 -0.26255654
## Monocytes                                              0.08079865  0.20697350
## Red.blood.cell.distribution.width..RDW.               -0.07250742 -0.10615734
##                                                                PC7
## Patient.age.quantile                                  -0.246002221
## Patient.addmited.to.regular.ward..1.yes..0.no.         0.052326491
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no. -0.212853970
## Patient.addmited.to.intensive.care.unit..1.yes..0.no.  0.504724096
## sickness                                               0.007698463
## Hematocrit                                            -0.380935192
## Platelets                                             -0.191776483
## Mean.platelet.volume                                   0.241496891
## Lymphocytes                                            0.232670184
## Mean.corpuscular.hemoglobin.concentration..MCHC.       0.276274605
## Leukocytes                                            -0.085039378
## Basophils                                              0.133941487
## Mean.corpuscular.hemoglobin..MCH.                      0.207003553
## Eosinophils                                           -0.028286083
## Monocytes                                             -0.438143504
## Red.blood.cell.distribution.width..RDW.                0.071877480
##                                                                 PC8         PC9
## Patient.age.quantile                                   0.0008153688 -0.21397753
## Patient.addmited.to.regular.ward..1.yes..0.no.         0.2773029833  0.46945473
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no. -0.1456334286  0.03769581
## Patient.addmited.to.intensive.care.unit..1.yes..0.no. -0.0015993409 -0.56173561
## sickness                                               0.5361970886  0.19604898
## Hematocrit                                            -0.2859409128  0.05510906
## Platelets                                             -0.0674780516 -0.04045282
## Mean.platelet.volume                                  -0.1971974895  0.09335023
## Lymphocytes                                           -0.3863375713 -0.04918774
## Mean.corpuscular.hemoglobin.concentration..MCHC.       0.1076684460 -0.01713134
## Leukocytes                                             0.0102215046  0.12417809
## Basophils                                              0.1039656111  0.10748252
## Mean.corpuscular.hemoglobin..MCH.                      0.3157985674 -0.07131256
## Eosinophils                                            0.2386597879  0.05484507
## Monocytes                                              0.3412756274 -0.55357709
## Red.blood.cell.distribution.width..RDW.                0.2103859605 -0.14425127
##                                                              PC10        PC11
## Patient.age.quantile                                  -0.44573404  0.09690978
## Patient.addmited.to.regular.ward..1.yes..0.no.         0.03197212  0.12138688
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no.  0.21507492  0.42590957
## Patient.addmited.to.intensive.care.unit..1.yes..0.no.  0.04958867  0.41615198
## sickness                                              -0.35259674  0.28226539
## Hematocrit                                            -0.00293928  0.36596444
## Platelets                                             -0.02469238 -0.18878830
## Mean.platelet.volume                                   0.16137064 -0.37419368
## Lymphocytes                                           -0.34974919 -0.04557123
## Mean.corpuscular.hemoglobin.concentration..MCHC.       0.27257934 -0.04444997
## Leukocytes                                             0.15971779 -0.21861477
## Basophils                                              0.45577586  0.32460162
## Mean.corpuscular.hemoglobin..MCH.                     -0.30173545 -0.10022085
## Eosinophils                                            0.11580787 -0.10220747
## Monocytes                                              0.25408350 -0.17989946
## Red.blood.cell.distribution.width..RDW.                0.04147078 -0.14819267
##                                                              PC12         PC13
## Patient.age.quantile                                  -0.07044842  0.085030759
## Patient.addmited.to.regular.ward..1.yes..0.no.        -0.15045394  0.414497044
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no. -0.08429535  0.178395159
## Patient.addmited.to.intensive.care.unit..1.yes..0.no. -0.20362330  0.217562189
## sickness                                               0.03915194  0.225689016
## Hematocrit                                            -0.03332508  0.014972940
## Platelets                                              0.24923027  0.383961840
## Mean.platelet.volume                                  -0.11895083  0.453558552
## Lymphocytes                                            0.06219777  0.315948429
## Mean.corpuscular.hemoglobin.concentration..MCHC.      -0.07713866 -0.032444164
## Leukocytes                                             0.05943353  0.210863189
## Basophils                                              0.48092570  0.045259874
## Mean.corpuscular.hemoglobin..MCH.                      0.26043437  0.005039532
## Eosinophils                                           -0.68683446 -0.118737823
## Monocytes                                              0.01062471  0.325450043
## Red.blood.cell.distribution.width..RDW.                0.24617614 -0.264712078
##                                                               PC14        PC15
## Patient.age.quantile                                   0.008950547 -0.47063901
## Patient.addmited.to.regular.ward..1.yes..0.no.        -0.122060439  0.16866736
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no. -0.124919420  0.18772569
## Patient.addmited.to.intensive.care.unit..1.yes..0.no.  0.047318666  0.11447026
## sickness                                              -0.068483647 -0.22316528
## Hematocrit                                            -0.386476142  0.26570188
## Platelets                                             -0.007964199  0.09559330
## Mean.platelet.volume                                   0.061327270  0.05032694
## Lymphocytes                                           -0.317829645 -0.07581222
## Mean.corpuscular.hemoglobin.concentration..MCHC.      -0.499722251 -0.41828489
## Leukocytes                                            -0.154994722 -0.15873162
## Basophils                                              0.195982173 -0.20349589
## Mean.corpuscular.hemoglobin..MCH.                      0.031399264  0.53393072
## Eosinophils                                           -0.033604805  0.10017841
## Monocytes                                             -0.012146442  0.01337090
## Red.blood.cell.distribution.width..RDW.               -0.627587300  0.16438432
##                                                               PC16
## Patient.age.quantile                                  -0.023737582
## Patient.addmited.to.regular.ward..1.yes..0.no.        -0.016080647
## Patient.addmited.to.semi.intensive.unit..1.yes..0.no.  0.016267896
## Patient.addmited.to.intensive.care.unit..1.yes..0.no. -0.002161518
## sickness                                              -0.078646486
## Hematocrit                                            -0.007668320
## Platelets                                             -0.543664648
## Mean.platelet.volume                                  -0.175991648
## Lymphocytes                                            0.309940816
## Mean.corpuscular.hemoglobin.concentration..MCHC.      -0.233244091
## Leukocytes                                             0.660702469
## Basophils                                              0.120917437
## Mean.corpuscular.hemoglobin..MCH.                      0.179648384
## Eosinophils                                            0.067245302
## Monocytes                                              0.139017987
## Red.blood.cell.distribution.width..RDW.               -0.091563494
biplot(clust_pcov, main = "Biplot Princomp Method", expand = 1, col = c("blue", "red"))

library(factoextra)
## Welcome! Want to learn more? See two factoextra-related books at https://goo.gl/ve3WBa
fviz_cluster(kmeans, geom = "point", data = clust_pcov$x[,1:2]) + ggtitle("K = 2 with PCA")

This plot of the clusters clearly shows that the points in the individual clusters overlap heavily. With a clustering method like this it is therefore not possible to determine whether a patient is healthy or infected with corona.

After this first overview we would expect that, partly because the data are imbalanced towards patients who are not infected with corona, the models will find it easy to achieve a high specificity, while detecting corona-infected patients (i.e. achieving a high sensitivity) could prove difficult. We now begin developing various classification models:
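To make this concern concrete, here is a minimal base-R sketch (the toy label vector and the "lazy" classifier are hypothetical, using an imbalance similar to the training data):

```r
# Toy labels with an imbalance similar to the dataset
actual <- factor(c(rep(0, 380), rep(1, 47)), levels = c(0, 1))
# A "lazy" classifier that predicts "not infected" for everyone
pred <- factor(rep(0, 427), levels = c(0, 1))

specificity <- mean(pred[actual == 0] == 0)  # 1.0 - every healthy patient correct
sensitivity <- mean(pred[actual == 1] == 1)  # 0.0 - no infection detected
accuracy    <- mean(pred == actual)          # ~0.89 despite missing every case
```

High accuracy alone is therefore not informative here; sensitivity has to be checked separately.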

Support Vector Machines (SVMs)

In this section we try to improve the classification of a corona infection using support vector machines (SVMs).

plot_1 <- ggplot(data = data_train, aes(x = target, fill = target)) + 
  geom_bar(stat = "count")
plot_1 <- plot_1 + ggtitle("Class distribution in the training set") +
  xlab("Classes") + ylab("Observations per class") + scale_fill_brewer(palette = "Dark2") + 
  theme(plot.title = element_text(hjust = 0.5))
plot_1

The training set is heavily imbalanced, which means the rare class problem described above is definitely present in this classification task.

table(data_train$target)
## 
##   0   1 
## 380  47
#set.seed(1910837262)
#up_train_svm <- upSample(x = data_train[, -ncol(data_train)],
                     #y = data_train$target)                         
#table(up_train_svm$target) 
#up_train_svm <- up_train_svm %>%
    #select(-Class)
#print(str(up_train_svm))

To give the SVM a better training basis, we upsample the training data. This resolves the rare class problem in the training set: it then contains 380 corona-infected and 380 non-infected patients. However, because of the feature engineering, these datasets are no longer needed.
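For reference, the commented-out upsampling step above can be sketched with caret::upSample on toy data (the data frame df below is hypothetical, with the 380/47 split of data_train):

```r
library(caret)
set.seed(1910837262)

# Hypothetical imbalanced frame mirroring the 380/47 split of data_train
df <- data.frame(x = rnorm(427),
                 target = factor(c(rep(0, 380), rep(1, 47))))

# upSample resamples the minority class with replacement until balanced;
# yname keeps the outcome column named "target" instead of "Class"
up <- upSample(x = df[, "x", drop = FALSE], y = df$target, yname = "target")
table(up$target)  # both classes now contain 380 rows
```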

up_train_svm <- read.csv("data/clean/train_feat_eng.csv")
data_test <- read.csv("data/clean/test_feat_eng.csv")
str(up_train_svm)
## 'data.frame':    760 obs. of  16 variables:
##  $ age              : int  17 1 9 11 13 9 17 17 19 10 ...
##  $ target           : int  0 0 0 0 0 0 0 0 0 0 ...
##  $ reg_ward         : int  0 0 0 0 0 1 0 0 0 0 ...
##  $ semi_unit        : int  0 1 0 0 0 0 0 0 1 0 ...
##  $ intense_unit     : int  0 0 0 0 0 0 0 0 0 0 ...
##  $ sickness         : int  1 0 1 1 1 1 1 1 1 0 ...
##  $ Hematocrit       : num  0.237 -1.572 -0.748 0.992 1.015 ...
##  $ Platelets        : num  -0.517 1.43 -0.429 0.073 -0.178 ...
##  $ Platelets_vol    : num  0.0107 -1.6722 -0.2137 -0.5503 0.796 ...
##  $ Lymphocytes      : num  0.31837 -0.00574 -1.11451 0.04544 -0.73071 ...
##  $ mean_hemoglobin  : num  -0.951 3.331 0.543 -0.453 -0.353 ...
##  $ Leukocytes       : num  -0.0946 0.3646 -0.8849 -0.2115 -0.0751 ...
##  $ Eosinophils      : num  1.482 1.019 -0.667 -0.709 0.218 ...
##  $ Monocytes        : num  0.3575 0.0687 1.2768 -0.2202 0.0687 ...
##  $ age_plat_leuk_eos: num  19.4 5.95 18.74 17.3 17.42 ...
##  $ age_leuk_eos     : num  10.06 4.28 9.86 9.91 9.56 ...
up_train_svm <- transform_type(up_train_svm)
data_test <- transform_type(data_test)
str(up_train_svm)
## 'data.frame':    760 obs. of  16 variables:
##  $ age              : num  17 1 9 11 13 9 17 17 19 10 ...
##  $ target           : Factor w/ 2 levels "0","1": 1 1 1 1 1 1 1 1 1 1 ...
##  $ reg_ward         : Factor w/ 2 levels "0","1": 1 1 1 1 1 2 1 1 1 1 ...
##  $ semi_unit        : Factor w/ 2 levels "0","1": 1 2 1 1 1 1 1 1 2 1 ...
##  $ intense_unit     : Factor w/ 2 levels "0","1": 1 1 1 1 1 1 1 1 1 1 ...
##  $ sickness         : Factor w/ 2 levels "0","1": 2 1 2 2 2 2 2 2 2 1 ...
##  $ Hematocrit       : num  0.237 -1.572 -0.748 0.992 1.015 ...
##  $ Platelets        : num  -0.517 1.43 -0.429 0.073 -0.178 ...
##  $ Platelets_vol    : num  0.0107 -1.6722 -0.2137 -0.5503 0.796 ...
##  $ Lymphocytes      : num  0.31837 -0.00574 -1.11451 0.04544 -0.73071 ...
##  $ mean_hemoglobin  : num  -0.951 3.331 0.543 -0.453 -0.353 ...
##  $ Leukocytes       : num  -0.0946 0.3646 -0.8849 -0.2115 -0.0751 ...
##  $ Eosinophils      : num  1.482 1.019 -0.667 -0.709 0.218 ...
##  $ Monocytes        : num  0.3575 0.0687 1.2768 -0.2202 0.0687 ...
##  $ age_plat_leuk_eos: num  19.4 5.95 18.74 17.3 17.42 ...
##  $ age_leuk_eos     : num  10.06 4.28 9.86 9.91 9.56 ...
methods = list("lssvmPoly", "lssvmRadial", "svmBoundrangeString", "svmRadialWeights", "svmExpoString", "svmLinear", "svmPoly", "svmRadial", "svmRadialCos", "svmRadialSigma", "svmSpectrumString")
set.seed(1910837388)

fitControl <- trainControl(## 10-fold CV
                           method = "repeatedcv",
                           number = 10,
                           ## repeated ten times
                           repeats = 3)

svm_fit_linear <- train(target ~ ., data = up_train_svm, 
                 method = "svmLinear", 
                 trControl = fitControl,
                 verbose = FALSE)
svm_fit_linear
## Support Vector Machines with Linear Kernel 
## 
## 760 samples
##  15 predictor
##   2 classes: '0', '1' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 3 times) 
## Summary of sample sizes: 684, 684, 684, 684, 684, 684, ... 
## Resampling results:
## 
##   Accuracy   Kappa    
##   0.8587719  0.7175439
## 
## Tuning parameter 'C' was held constant at a value of 1
set.seed(1910837388)

svm_fit_linear <- train(target ~ ., data = up_train_svm, 
                 method = "svmLinear", 
                 trControl = fitControl,
                 tuneGrid = expand.grid(C = seq(0.000000001, 5, length = 50)),
                 verbose = FALSE)
svm_fit_linear
## Support Vector Machines with Linear Kernel 
## 
## 760 samples
##  15 predictor
##   2 classes: '0', '1' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 3 times) 
## Summary of sample sizes: 684, 684, 684, 684, 684, 684, ... 
## Resampling results across tuning parameters:
## 
##   C            Accuracy   Kappa    
##   0.000000001  0.7385965  0.4771930
##   0.102040817  0.8609649  0.7219298
##   0.204081634  0.8605263  0.7210526
##   0.306122450  0.8592105  0.7184211
##   0.408163266  0.8574561  0.7149123
##   0.510204083  0.8565789  0.7131579
##   0.612244899  0.8570175  0.7140351
##   0.714285715  0.8570175  0.7140351
##   0.816326531  0.8583333  0.7166667
##   0.918367348  0.8587719  0.7175439
##   1.020408164  0.8587719  0.7175439
##   1.122448980  0.8583333  0.7166667
##   1.224489797  0.8587719  0.7175439
##   1.326530613  0.8587719  0.7175439
##   1.428571429  0.8592105  0.7184211
##   1.530612246  0.8587719  0.7175439
##   1.632653062  0.8587719  0.7175439
##   1.734693878  0.8592105  0.7184211
##   1.836734695  0.8592105  0.7184211
##   1.938775511  0.8583333  0.7166667
##   2.040816327  0.8583333  0.7166667
##   2.142857143  0.8587719  0.7175439
##   2.244897960  0.8587719  0.7175439
##   2.346938776  0.8587719  0.7175439
##   2.448979592  0.8587719  0.7175439
##   2.551020409  0.8587719  0.7175439
##   2.653061225  0.8587719  0.7175439
##   2.755102041  0.8587719  0.7175439
##   2.857142858  0.8587719  0.7175439
##   2.959183674  0.8587719  0.7175439
##   3.061224490  0.8587719  0.7175439
##   3.163265306  0.8587719  0.7175439
##   3.265306123  0.8592105  0.7184211
##   3.367346939  0.8592105  0.7184211
##   3.469387755  0.8592105  0.7184211
##   3.571428572  0.8592105  0.7184211
##   3.673469388  0.8592105  0.7184211
##   3.775510204  0.8592105  0.7184211
##   3.877551021  0.8592105  0.7184211
##   3.979591837  0.8592105  0.7184211
##   4.081632653  0.8592105  0.7184211
##   4.183673470  0.8592105  0.7184211
##   4.285714286  0.8592105  0.7184211
##   4.387755102  0.8592105  0.7184211
##   4.489795918  0.8592105  0.7184211
##   4.591836735  0.8592105  0.7184211
##   4.693877551  0.8592105  0.7184211
##   4.795918367  0.8592105  0.7184211
##   4.897959184  0.8592105  0.7184211
##   5.000000000  0.8592105  0.7184211
## 
## Accuracy was used to select the optimal model using the largest value.
## The final value used for the model was C = 0.1020408.

We optimise the tuning parameter C, first over a wide range, then in a second step over a narrower range adapted to the results of the first run.
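This coarse-to-fine idea can be sketched independently of caret; cv_accuracy below is a hypothetical stand-in for the cross-validated accuracy as a function of C:

```r
# Hypothetical CV-accuracy curve with its peak near C = 0.05
cv_accuracy <- function(C) -(log10(C) - log10(0.05))^2

# Stage 1: wide grid spanning several orders of magnitude
coarse <- 10^seq(-4, 1, length.out = 20)
best_coarse <- coarse[which.max(sapply(coarse, cv_accuracy))]

# Stage 2: narrow grid around the best coarse value
fine <- seq(best_coarse / 3, best_coarse * 3, length.out = 20)
best_fine <- fine[which.max(sapply(fine, cv_accuracy))]
```

The second stage can only match or improve on the first, since it searches a denser grid around the coarse optimum.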

set.seed(1910837388)
svm_fit_linear <- train(target ~ ., data = up_train_svm, 
                 method = "svmLinear", 
                 trControl = fitControl,
                 tuneGrid = expand.grid(C = seq(0.01, 0.3, length = 50)),
                 verbose = FALSE)
svm_fit_linear
## Support Vector Machines with Linear Kernel 
## 
## 760 samples
##  15 predictor
##   2 classes: '0', '1' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 3 times) 
## Summary of sample sizes: 684, 684, 684, 684, 684, 684, ... 
## Resampling results across tuning parameters:
## 
##   C           Accuracy   Kappa    
##   0.01000000  0.8526316  0.7052632
##   0.01591837  0.8548246  0.7096491
##   0.02183673  0.8557018  0.7114035
##   0.02775510  0.8592105  0.7184211
##   0.03367347  0.8574561  0.7149123
##   0.03959184  0.8596491  0.7192982
##   0.04551020  0.8631579  0.7263158
##   0.05142857  0.8640351  0.7280702
##   0.05734694  0.8640351  0.7280702
##   0.06326531  0.8622807  0.7245614
##   0.06918367  0.8640351  0.7280702
##   0.07510204  0.8627193  0.7254386
##   0.08102041  0.8618421  0.7236842
##   0.08693878  0.8609649  0.7219298
##   0.09285714  0.8605263  0.7210526
##   0.09877551  0.8609649  0.7219298
##   0.10469388  0.8618421  0.7236842
##   0.11061224  0.8622807  0.7245614
##   0.11653061  0.8618421  0.7236842
##   0.12244898  0.8618421  0.7236842
##   0.12836735  0.8618421  0.7236842
##   0.13428571  0.8622807  0.7245614
##   0.14020408  0.8618421  0.7236842
##   0.14612245  0.8622807  0.7245614
##   0.15204082  0.8618421  0.7236842
##   0.15795918  0.8614035  0.7228070
##   0.16387755  0.8614035  0.7228070
##   0.16979592  0.8605263  0.7210526
##   0.17571429  0.8605263  0.7210526
##   0.18163265  0.8605263  0.7210526
##   0.18755102  0.8600877  0.7201754
##   0.19346939  0.8600877  0.7201754
##   0.19938776  0.8605263  0.7210526
##   0.20530612  0.8600877  0.7201754
##   0.21122449  0.8600877  0.7201754
##   0.21714286  0.8600877  0.7201754
##   0.22306122  0.8600877  0.7201754
##   0.22897959  0.8605263  0.7210526
##   0.23489796  0.8600877  0.7201754
##   0.24081633  0.8605263  0.7210526
##   0.24673469  0.8605263  0.7210526
##   0.25265306  0.8605263  0.7210526
##   0.25857143  0.8605263  0.7210526
##   0.26448980  0.8600877  0.7201754
##   0.27040816  0.8600877  0.7201754
##   0.27632653  0.8600877  0.7201754
##   0.28224490  0.8600877  0.7201754
##   0.28816327  0.8592105  0.7184211
##   0.29408163  0.8592105  0.7184211
##   0.30000000  0.8592105  0.7184211
## 
## Accuracy was used to select the optimal model using the largest value.
## The final value used for the model was C = 0.05142857.
# Plot model accuracy vs different values of Cost
plot(svm_fit_linear)

The plot nicely illustrates what the previous computation already told us: C = 0.0514 yields the best accuracy for this model on the training data. We now determine the accuracy on the test data:

prediction_linear <- svm_fit_linear %>% predict(data_test)
mean(prediction_linear == data_test$target)
## [1] 0.8857143
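Plain accuracy hides the class-wise behaviour discussed earlier; a confusion table yields sensitivity and specificity directly. A base-R sketch with toy vectors standing in for prediction_linear and data_test$target:

```r
pred <- factor(c(1, 0, 0, 1, 0), levels = c(0, 1))  # toy predictions
obs  <- factor(c(1, 0, 1, 1, 0), levels = c(0, 1))  # toy ground truth

tab  <- table(Predicted = pred, Actual = obs)
sens <- tab["1", "1"] / sum(tab[, "1"])  # 2/3: infected patients detected
spec <- tab["0", "0"] / sum(tab[, "0"])  # 1.0: healthy patients detected
```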

Let us now fit a different, non-linear model; as before, we first train without tuning parameters and then narrow the search down:

set.seed(1910837388)

svm_fit_radial <- train(target ~ ., data = up_train_svm, 
                 method = "svmRadial", 
                 tuneLength = 9,
                 trControl = fitControl,
                 verbose = FALSE)
svm_fit_radial
## Support Vector Machines with Radial Basis Function Kernel 
## 
## 760 samples
##  15 predictor
##   2 classes: '0', '1' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 3 times) 
## Summary of sample sizes: 684, 684, 684, 684, 684, 684, ... 
## Resampling results across tuning parameters:
## 
##   C      Accuracy   Kappa    
##    0.25  0.8912281  0.7824561
##    0.50  0.9109649  0.8219298
##    1.00  0.9228070  0.8456140
##    2.00  0.9254386  0.8508772
##    4.00  0.9276316  0.8552632
##    8.00  0.9276316  0.8552632
##   16.00  0.9364035  0.8728070
##   32.00  0.9346491  0.8692982
##   64.00  0.9337719  0.8675439
## 
## Tuning parameter 'sigma' was held constant at a value of 0.05561306
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were sigma = 0.05561306 and C = 16.
set.seed(1910837388)
# Use the expand.grid to specify the search space   
grid <- expand.grid(sigma = seq(0.01, 0.1, length = 10),
                    C = seq(14, 18, length = 20))

svm_fit_radial <- train(target ~ ., data = up_train_svm, 
                 method = "svmRadial", 
                 tuneGrid = grid,
                 trControl = fitControl,
                 verbose = FALSE)
svm_fit_radial
## Support Vector Machines with Radial Basis Function Kernel 
## 
## 760 samples
##  15 predictor
##   2 classes: '0', '1' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 3 times) 
## Summary of sample sizes: 684, 684, 684, 684, 684, 684, ... 
## Resampling results across tuning parameters:
## 
##   sigma  C         Accuracy   Kappa    
##   0.01   14.00000  0.9087719  0.8175439
##   0.01   14.21053  0.9100877  0.8201754
##   0.01   14.42105  0.9105263  0.8210526
##   0.01   14.63158  0.9105263  0.8210526
##   0.01   14.84211  0.9100877  0.8201754
##   0.01   15.05263  0.9100877  0.8201754
##   0.01   15.26316  0.9096491  0.8192982
##   0.01   15.47368  0.9096491  0.8192982
##   0.01   15.68421  0.9096491  0.8192982
##   0.01   15.89474  0.9100877  0.8201754
##   0.01   16.10526  0.9100877  0.8201754
##   0.01   16.31579  0.9109649  0.8219298
##   0.01   16.52632  0.9109649  0.8219298
##   0.01   16.73684  0.9105263  0.8210526
##   0.01   16.94737  0.9118421  0.8236842
##   0.01   17.15789  0.9118421  0.8236842
##   0.01   17.36842  0.9131579  0.8263158
##   0.01   17.57895  0.9131579  0.8263158
##   0.01   17.78947  0.9131579  0.8263158
##   0.01   18.00000  0.9135965  0.8271930
##   0.02   14.00000  0.9258772  0.8517544
##   0.02   14.21053  0.9258772  0.8517544
##   0.02   14.42105  0.9258772  0.8517544
##   0.02   14.63158  0.9258772  0.8517544
##   0.02   14.84211  0.9263158  0.8526316
##   0.02   15.05263  0.9258772  0.8517544
##   0.02   15.26316  0.9254386  0.8508772
##   0.02   15.47368  0.9254386  0.8508772
##   0.02   15.68421  0.9254386  0.8508772
##   0.02   15.89474  0.9258772  0.8517544
##   0.02   16.10526  0.9263158  0.8526316
##   0.02   16.31579  0.9263158  0.8526316
##   0.02   16.52632  0.9263158  0.8526316
##   0.02   16.73684  0.9267544  0.8535088
##   0.02   16.94737  0.9263158  0.8526316
##   0.02   17.15789  0.9258772  0.8517544
##   0.02   17.36842  0.9258772  0.8517544
##   0.02   17.57895  0.9258772  0.8517544
##   0.02   17.78947  0.9258772  0.8517544
##   0.02   18.00000  0.9258772  0.8517544
##   0.03   14.00000  0.9293860  0.8587719
##   0.03   14.21053  0.9293860  0.8587719
##   0.03   14.42105  0.9298246  0.8596491
##   0.03   14.63158  0.9293860  0.8587719
##   0.03   14.84211  0.9293860  0.8587719
##   0.03   15.05263  0.9293860  0.8587719
##   0.03   15.26316  0.9293860  0.8587719
##   0.03   15.47368  0.9289474  0.8578947
##   0.03   15.68421  0.9280702  0.8561404
##   0.03   15.89474  0.9271930  0.8543860
##   0.03   16.10526  0.9267544  0.8535088
##   0.03   16.31579  0.9267544  0.8535088
##   0.03   16.52632  0.9263158  0.8526316
##   0.03   16.73684  0.9267544  0.8535088
##   0.03   16.94737  0.9254386  0.8508772
##   0.03   17.15789  0.9254386  0.8508772
##   0.03   17.36842  0.9254386  0.8508772
##   0.03   17.57895  0.9254386  0.8508772
##   0.03   17.78947  0.9254386  0.8508772
##   0.03   18.00000  0.9258772  0.8517544
##   0.04   14.00000  0.9267544  0.8535088
##   0.04   14.21053  0.9267544  0.8535088
##   0.04   14.42105  0.9267544  0.8535088
##   0.04   14.63158  0.9263158  0.8526316
##   0.04   14.84211  0.9271930  0.8543860
##   0.04   15.05263  0.9280702  0.8561404
##   0.04   15.26316  0.9280702  0.8561404
##   0.04   15.47368  0.9280702  0.8561404
##   0.04   15.68421  0.9280702  0.8561404
##   0.04   15.89474  0.9280702  0.8561404
##   0.04   16.10526  0.9289474  0.8578947
##   0.04   16.31579  0.9293860  0.8587719
##   0.04   16.52632  0.9289474  0.8578947
##   0.04   16.73684  0.9285088  0.8570175
##   0.04   16.94737  0.9293860  0.8587719
##   0.04   17.15789  0.9293860  0.8587719
##   0.04   17.36842  0.9298246  0.8596491
##   0.04   17.57895  0.9293860  0.8587719
##   0.04   17.78947  0.9293860  0.8587719
##   0.04   18.00000  0.9298246  0.8596491
##   0.05   14.00000  0.9311404  0.8622807
##   0.05   14.21053  0.9311404  0.8622807
##   0.05   14.42105  0.9311404  0.8622807
##   0.05   14.63158  0.9315789  0.8631579
##   0.05   14.84211  0.9320175  0.8640351
##   0.05   15.05263  0.9320175  0.8640351
##   0.05   15.26316  0.9320175  0.8640351
##   0.05   15.47368  0.9320175  0.8640351
##   0.05   15.68421  0.9333333  0.8666667
##   0.05   15.89474  0.9342105  0.8684211
##   0.05   16.10526  0.9350877  0.8701754
##   0.05   16.31579  0.9350877  0.8701754
##   0.05   16.52632  0.9355263  0.8710526
##   0.05   16.73684  0.9355263  0.8710526
##   0.05   16.94737  0.9355263  0.8710526
##   0.05   17.15789  0.9355263  0.8710526
##   0.05   17.36842  0.9350877  0.8701754
##   0.05   17.57895  0.9346491  0.8692982
##   0.05   17.78947  0.9346491  0.8692982
##   0.05   18.00000  0.9342105  0.8684211
##   0.06   14.00000  0.9364035  0.8728070
##   0.06   14.21053  0.9372807  0.8745614
##   0.06   14.42105  0.9359649  0.8719298
##   0.06   14.63158  0.9359649  0.8719298
##   0.06   14.84211  0.9359649  0.8719298
##   0.06   15.05263  0.9364035  0.8728070
##   0.06   15.26316  0.9364035  0.8728070
##   0.06   15.47368  0.9364035  0.8728070
##   0.06   15.68421  0.9364035  0.8728070
##   0.06   15.89474  0.9368421  0.8736842
##   0.06   16.10526  0.9372807  0.8745614
##   0.06   16.31579  0.9381579  0.8763158
##   0.06   16.52632  0.9385965  0.8771930
##   0.06   16.73684  0.9385965  0.8771930
##   0.06   16.94737  0.9385965  0.8771930
##   0.06   17.15789  0.9390351  0.8780702
##   0.06   17.36842  0.9390351  0.8780702
##   0.06   17.57895  0.9385965  0.8771930
##   0.06   17.78947  0.9390351  0.8780702
##   0.06   18.00000  0.9390351  0.8780702
##   0.07   14.00000  0.9377193  0.8754386
##   0.07   14.21053  0.9377193  0.8754386
##   0.07   14.42105  0.9385965  0.8771930
##   0.07   14.63158  0.9385965  0.8771930
##   0.07   14.84211  0.9385965  0.8771930
##   0.07   15.05263  0.9381579  0.8763158
##   0.07   15.26316  0.9381579  0.8763158
##   0.07   15.47368  0.9381579  0.8763158
##   0.07   15.68421  0.9377193  0.8754386
##   0.07   15.89474  0.9377193  0.8754386
##   0.07   16.10526  0.9372807  0.8745614
##   0.07   16.31579  0.9372807  0.8745614
##   0.07   16.52632  0.9368421  0.8736842
##   0.07   16.73684  0.9368421  0.8736842
##   0.07   16.94737  0.9359649  0.8719298
##   0.07   17.15789  0.9359649  0.8719298
##   0.07   17.36842  0.9355263  0.8710526
##   0.07   17.57895  0.9355263  0.8710526
##   0.07   17.78947  0.9355263  0.8710526
##   0.07   18.00000  0.9355263  0.8710526
##   0.08   14.00000  0.9355263  0.8710526
##   0.08   14.21053  0.9355263  0.8710526
##   0.08   14.42105  0.9355263  0.8710526
##   0.08   14.63158  0.9350877  0.8701754
##   0.08   14.84211  0.9350877  0.8701754
##   0.08   15.05263  0.9350877  0.8701754
##   0.08   15.26316  0.9350877  0.8701754
##   0.08   15.47368  0.9346491  0.8692982
##   0.08   15.68421  0.9342105  0.8684211
##   0.08   15.89474  0.9342105  0.8684211
##   0.08   16.10526  0.9346491  0.8692982
##   0.08   16.31579  0.9346491  0.8692982
##   0.08   16.52632  0.9346491  0.8692982
##   0.08   16.73684  0.9346491  0.8692982
##   0.08   16.94737  0.9346491  0.8692982
##   0.08   17.15789  0.9346491  0.8692982
##   0.08   17.36842  0.9346491  0.8692982
##   0.08   17.57895  0.9350877  0.8701754
##   0.08   17.78947  0.9350877  0.8701754
##   0.08   18.00000  0.9346491  0.8692982
##   0.09   14.00000  0.9346491  0.8692982
##   0.09   14.21053  0.9346491  0.8692982
##   0.09   14.42105  0.9346491  0.8692982
##   0.09   14.63158  0.9342105  0.8684211
##   0.09   14.84211  0.9342105  0.8684211
##   0.09   15.05263  0.9346491  0.8692982
##   0.09   15.26316  0.9342105  0.8684211
##   0.09   15.47368  0.9346491  0.8692982
##   0.09   15.68421  0.9350877  0.8701754
##   0.09   15.89474  0.9346491  0.8692982
##   0.09   16.10526  0.9350877  0.8701754
##   0.09   16.31579  0.9350877  0.8701754
##   0.09   16.52632  0.9350877  0.8701754
##   0.09   16.73684  0.9350877  0.8701754
##   0.09   16.94737  0.9350877  0.8701754
##   0.09   17.15789  0.9350877  0.8701754
##   0.09   17.36842  0.9350877  0.8701754
##   0.09   17.57895  0.9350877  0.8701754
##   0.09   17.78947  0.9355263  0.8710526
##   0.09   18.00000  0.9355263  0.8710526
##   0.10   14.00000  0.9359649  0.8719298
##   0.10   14.21053  0.9359649  0.8719298
##   0.10   14.42105  0.9359649  0.8719298
##   0.10   14.63158  0.9359649  0.8719298
##   0.10   14.84211  0.9359649  0.8719298
##   0.10   15.05263  0.9359649  0.8719298
##   0.10   15.26316  0.9359649  0.8719298
##   0.10   15.47368  0.9359649  0.8719298
##   0.10   15.68421  0.9355263  0.8710526
##   0.10   15.89474  0.9355263  0.8710526
##   0.10   16.10526  0.9355263  0.8710526
##   0.10   16.31579  0.9355263  0.8710526
##   0.10   16.52632  0.9355263  0.8710526
##   0.10   16.73684  0.9355263  0.8710526
##   0.10   16.94737  0.9355263  0.8710526
##   0.10   17.15789  0.9355263  0.8710526
##   0.10   17.36842  0.9359649  0.8719298
##   0.10   17.57895  0.9364035  0.8728070
##   0.10   17.78947  0.9364035  0.8728070
##   0.10   18.00000  0.9364035  0.8728070
## 
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were sigma = 0.06 and C = 17.15789.
# Plot model accuracy vs different values of Cost
plot(svm_fit_radial)

prediction_radial <- svm_fit_radial %>% predict(data_test)
mean(prediction_radial == data_test$target)
## [1] 0.9047619
set.seed(1910837388)

svm_fit_poly <- train(target ~ ., data = up_train_svm, 
                 method = "svmPoly", 
                 tuneLength = 4,
                 trControl = fitControl,
                 verbose = FALSE)
svm_fit_poly
## Support Vector Machines with Polynomial Kernel 
## 
## 760 samples
##  15 predictor
##   2 classes: '0', '1' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold, repeated 3 times) 
## Summary of sample sizes: 684, 684, 684, 684, 684, 684, ... 
## Resampling results across tuning parameters:
## 
##   degree  scale  C     Accuracy   Kappa    
##   1       0.001  0.25  0.7815789  0.5631579
##   1       0.001  0.50  0.8245614  0.6491228
##   1       0.001  1.00  0.8359649  0.6719298
##   1       0.001  2.00  0.8394737  0.6789474
##   1       0.010  0.25  0.8416667  0.6833333
##   1       0.010  0.50  0.8478070  0.6956140
##   1       0.010  1.00  0.8526316  0.7052632
##   1       0.010  2.00  0.8561404  0.7122807
##   1       0.100  0.25  0.8557018  0.7114035
##   1       0.100  0.50  0.8640351  0.7280702
##   1       0.100  1.00  0.8614035  0.7228070
##   1       0.100  2.00  0.8605263  0.7210526
##   1       1.000  0.25  0.8605263  0.7210526
##   1       1.000  0.50  0.8565789  0.7131579
##   1       1.000  1.00  0.8587719  0.7175439
##   1       1.000  2.00  0.8587719  0.7175439
##   2       0.001  0.25  0.8245614  0.6491228
##   2       0.001  0.50  0.8359649  0.6719298
##   2       0.001  1.00  0.8403509  0.6807018
##   2       0.001  2.00  0.8434211  0.6868421
##   2       0.010  0.25  0.8482456  0.6964912
##   2       0.010  0.50  0.8561404  0.7122807
##   2       0.010  1.00  0.8640351  0.7280702
##   2       0.010  2.00  0.8728070  0.7456140
##   2       0.100  0.25  0.9109649  0.8219298
##   2       0.100  0.50  0.9157895  0.8315789
##   2       0.100  1.00  0.9236842  0.8473684
##   2       0.100  2.00  0.9254386  0.8508772
##   2       1.000  0.25  0.9377193  0.8754386
##   2       1.000  0.50  0.9280702  0.8561404
##   2       1.000  1.00  0.9241228  0.8482456
##   2       1.000  2.00  0.9241228  0.8482456
##   3       0.001  0.25  0.8280702  0.6561404
##   3       0.001  0.50  0.8429825  0.6859649
##   3       0.001  1.00  0.8403509  0.6807018
##   3       0.001  2.00  0.8460526  0.6921053
##   3       0.010  0.25  0.8583333  0.7166667
##   3       0.010  0.50  0.8640351  0.7280702
##   3       0.010  1.00  0.8811404  0.7622807
##   3       0.010  2.00  0.8864035  0.7728070
##   3       0.100  0.25  0.9228070  0.8456140
##   3       0.100  0.50  0.9271930  0.8543860
##   3       0.100  1.00  0.9250000  0.8500000
##   3       0.100  2.00  0.9223684  0.8447368
##   3       1.000  0.25  0.9122807  0.8245614
##   3       1.000  0.50  0.9149123  0.8298246
##   3       1.000  1.00  0.9162281  0.8324561
##   3       1.000  2.00  0.9166667  0.8333333
## 
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were degree = 2, scale = 1 and C = 0.25.

The final values used for the model were degree = 2, scale = 1 and C = 0.25.

#set.seed(1910837388)

#grid <- expand.grid(degree = seq(1, 4, length = 4),
                    #scale = seq(0.5, 2, length = 10),
                    #C = seq(0.5, 2, length = 10))

#svm_fit_poly <- train(target ~ ., data = up_train_svm, 
                 #method = "svmPoly", 
                 #tuneGrid = grid,
                 #trControl = fitControl,
                 #verbose = FALSE)
#svm_fit_poly

The final values used for the model were degree = 2, scale = 0.66667 and C = 0.5.

set.seed(1910837388)

grid <- expand.grid(degree = seq(2, 2, length = 1),
                    scale = seq(0.5, 0.7, length = 10),
                    C = seq(0.4, 0.6, length = 10))

svm_fit_poly <- train(target ~ ., data = up_train_svm, 
                 method = "svmPoly", 
                 tuneGrid = grid,
                 preProc = c("center","scale"),
                 trControl = fitControl,
                 verbose = FALSE)
svm_fit_poly
## Support Vector Machines with Polynomial Kernel 
## 
## 760 samples
##  15 predictor
##   2 classes: '0', '1' 
## 
## Pre-processing: centered (15), scaled (15) 
## Resampling: Cross-Validated (10 fold, repeated 3 times) 
## Summary of sample sizes: 684, 684, 684, 684, 684, 684, ... 
## Resampling results across tuning parameters:
## 
##   scale      C          Accuracy   Kappa    
##   0.5000000  0.4000000  0.9298246  0.8596491
##   0.5000000  0.4222222  0.9302632  0.8605263
##   0.5000000  0.4444444  0.9293860  0.8587719
##   0.5000000  0.4666667  0.9311404  0.8622807
##   0.5000000  0.4888889  0.9324561  0.8649123
##   0.5000000  0.5111111  0.9320175  0.8640351
##   0.5000000  0.5333333  0.9350877  0.8701754
##   0.5000000  0.5555556  0.9320175  0.8640351
##   0.5000000  0.5777778  0.9320175  0.8640351
##   0.5000000  0.6000000  0.9333333  0.8666667
##   0.5222222  0.4000000  0.9298246  0.8596491
##   0.5222222  0.4222222  0.9302632  0.8605263
##   0.5222222  0.4444444  0.9324561  0.8649123
##   0.5222222  0.4666667  0.9320175  0.8640351
##   0.5222222  0.4888889  0.9346491  0.8692982
##   0.5222222  0.5111111  0.9315789  0.8631579
##   0.5222222  0.5333333  0.9324561  0.8649123
##   0.5222222  0.5555556  0.9342105  0.8684211
##   0.5222222  0.5777778  0.9337719  0.8675439
##   0.5222222  0.6000000  0.9337719  0.8675439
##   0.5444444  0.4000000  0.9324561  0.8649123
##   0.5444444  0.4222222  0.9320175  0.8640351
##   0.5444444  0.4444444  0.9346491  0.8692982
##   0.5444444  0.4666667  0.9311404  0.8622807
##   0.5444444  0.4888889  0.9320175  0.8640351
##   0.5444444  0.5111111  0.9337719  0.8675439
##   0.5444444  0.5333333  0.9342105  0.8684211
##   0.5444444  0.5555556  0.9342105  0.8684211
##   0.5444444  0.5777778  0.9337719  0.8675439
##   0.5444444  0.6000000  0.9350877  0.8701754
##   0.5666667  0.4000000  0.9324561  0.8649123
##   0.5666667  0.4222222  0.9342105  0.8684211
##   0.5666667  0.4444444  0.9311404  0.8622807
##   0.5666667  0.4666667  0.9333333  0.8666667
##   0.5666667  0.4888889  0.9346491  0.8692982
##   0.5666667  0.5111111  0.9342105  0.8684211
##   0.5666667  0.5333333  0.9337719  0.8675439
##   0.5666667  0.5555556  0.9350877  0.8701754
##   0.5666667  0.5777778  0.9364035  0.8728070
##   0.5666667  0.6000000  0.9372807  0.8745614
##   0.5888889  0.4000000  0.9320175  0.8640351
##   0.5888889  0.4222222  0.9324561  0.8649123
##   0.5888889  0.4444444  0.9337719  0.8675439
##   0.5888889  0.4666667  0.9342105  0.8684211
##   0.5888889  0.4888889  0.9333333  0.8666667
##   0.5888889  0.5111111  0.9346491  0.8692982
##   0.5888889  0.5333333  0.9364035  0.8728070
##   0.5888889  0.5555556  0.9368421  0.8736842
##   0.5888889  0.5777778  0.9385965  0.8771930
##   0.5888889  0.6000000  0.9385965  0.8771930
##   0.6111111  0.4000000  0.9328947  0.8657895
##   0.6111111  0.4222222  0.9342105  0.8684211
##   0.6111111  0.4444444  0.9346491  0.8692982
##   0.6111111  0.4666667  0.9342105  0.8684211
##   0.6111111  0.4888889  0.9355263  0.8710526
##   0.6111111  0.5111111  0.9372807  0.8745614
##   0.6111111  0.5333333  0.9385965  0.8771930
##   0.6111111  0.5555556  0.9381579  0.8763158
##   0.6111111  0.5777778  0.9385965  0.8771930
##   0.6111111  0.6000000  0.9390351  0.8780702
##   0.6333333  0.4000000  0.9342105  0.8684211
##   0.6333333  0.4222222  0.9346491  0.8692982
##   0.6333333  0.4444444  0.9346491  0.8692982
##   0.6333333  0.4666667  0.9364035  0.8728070
##   0.6333333  0.4888889  0.9377193  0.8754386
##   0.6333333  0.5111111  0.9381579  0.8763158
##   0.6333333  0.5333333  0.9390351  0.8780702
##   0.6333333  0.5555556  0.9390351  0.8780702
##   0.6333333  0.5777778  0.9394737  0.8789474
##   0.6333333  0.6000000  0.9399123  0.8798246
##   0.6555556  0.4000000  0.9337719  0.8675439
##   0.6555556  0.4222222  0.9350877  0.8701754
##   0.6555556  0.4444444  0.9372807  0.8745614
##   0.6555556  0.4666667  0.9385965  0.8771930
##   0.6555556  0.4888889  0.9381579  0.8763158
##   0.6555556  0.5111111  0.9390351  0.8780702
##   0.6555556  0.5333333  0.9394737  0.8789474
##   0.6555556  0.5555556  0.9385965  0.8771930
##   0.6555556  0.5777778  0.9390351  0.8780702
##   0.6555556  0.6000000  0.9377193  0.8754386
##   0.6777778  0.4000000  0.9355263  0.8710526
##   0.6777778  0.4222222  0.9368421  0.8736842
##   0.6777778  0.4444444  0.9385965  0.8771930
##   0.6777778  0.4666667  0.9385965  0.8771930
##   0.6777778  0.4888889  0.9394737  0.8789474
##   0.6777778  0.5111111  0.9390351  0.8780702
##   0.6777778  0.5333333  0.9390351  0.8780702
##   0.6777778  0.5555556  0.9385965  0.8771930
##   0.6777778  0.5777778  0.9372807  0.8745614
##   0.6777778  0.6000000  0.9359649  0.8719298
##   0.7000000  0.4000000  0.9368421  0.8736842
##   0.7000000  0.4222222  0.9381579  0.8763158
##   0.7000000  0.4444444  0.9390351  0.8780702
##   0.7000000  0.4666667  0.9394737  0.8789474
##   0.7000000  0.4888889  0.9390351  0.8780702
##   0.7000000  0.5111111  0.9390351  0.8780702
##   0.7000000  0.5333333  0.9377193  0.8754386
##   0.7000000  0.5555556  0.9364035  0.8728070
##   0.7000000  0.5777778  0.9359649  0.8719298
##   0.7000000  0.6000000  0.9355263  0.8710526
## 
## Tuning parameter 'degree' was held constant at a value of 2
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were degree = 2, scale = 0.6333333 and C
##  = 0.6.

The final values used for the model were degree = 2, scale = 0.6333333 and C = 0.6.
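
The selected hyperparameters do not have to be read off the printed grid; caret stores them in the fitted object. A small sketch using the `svm_fit_poly` object from above:

```r
# caret keeps the winning hyperparameter combination in bestTune
svm_fit_poly$bestTune        # data frame with degree, scale and C
# and the per-combination resampling results in results
head(svm_fit_poly$results)
```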

plot(svm_fit_poly)

prediction_poly <- svm_fit_poly %>% predict(data_test)
mean(prediction_poly == data_test$target)
## [1] 0.8761905
set.seed(1910837388)

svm_fit_RSigma <- train(target ~ ., data = up_train_svm, 
                 method = "svmRadialSigma",
                 trControl = trainControl(method = "cv"),
                 verbose = FALSE)
svm_fit_RSigma
## Support Vector Machines with Radial Basis Function Kernel 
## 
## 760 samples
##  15 predictor
##   2 classes: '0', '1' 
## 
## No pre-processing
## Resampling: Cross-Validated (10 fold) 
## Summary of sample sizes: 684, 684, 684, 684, 684, 684, ... 
## Resampling results across tuning parameters:
## 
##   sigma       C     Accuracy   Kappa    
##   0.01930794  0.25  0.8605263  0.7210526
##   0.01930794  0.50  0.8605263  0.7210526
##   0.01930794  1.00  0.8842105  0.7684211
##   0.05561306  0.25  0.8894737  0.7789474
##   0.05561306  0.50  0.9078947  0.8157895
##   0.05561306  1.00  0.9210526  0.8421053
##   0.09191819  0.25  0.9118421  0.8236842
##   0.09191819  0.50  0.9250000  0.8500000
##   0.09191819  1.00  0.9236842  0.8473684
## 
## Accuracy was used to select the optimal model using the largest value.
## The final values used for the model were sigma = 0.09191819 and C = 0.5.
prediction_RSigma <- svm_fit_RSigma %>% predict(data_test)
mean(prediction_RSigma == data_test$target)
## [1] 0.9714286
plot(svm_fit_RSigma)

results_svm_linear <- data.frame(actual = data_test$target, prediction = prediction_linear)
results_svm_radial <- data.frame(actual = data_test$target, prediction = prediction_radial)
results_svm_poly <- data.frame(actual = data_test$target, prediction = prediction_poly)
results_svm_RS <- data.frame(actual = data_test$target, prediction = prediction_RSigma)
CM_linear <- confusionMatrix(table(results_svm_linear$actual,results_svm_linear$prediction))
CM_radial <- confusionMatrix(table(results_svm_radial$actual,results_svm_radial$prediction))
CM_poly <- confusionMatrix(table(results_svm_poly$actual,results_svm_poly$prediction))
CM_RS <- confusionMatrix(table(results_svm_RS$actual,results_svm_RS$prediction))
CM_linear
## Confusion Matrix and Statistics
## 
##    
##      0  1
##   0 84 10
##   1  2  9
##                                           
##                Accuracy : 0.8857          
##                  95% CI : (0.8089, 0.9395)
##     No Information Rate : 0.819           
##     P-Value [Acc > NIR] : 0.04408         
##                                           
##                   Kappa : 0.5388          
##                                           
##  Mcnemar's Test P-Value : 0.04331         
##                                           
##             Sensitivity : 0.9767          
##             Specificity : 0.4737          
##          Pos Pred Value : 0.8936          
##          Neg Pred Value : 0.8182          
##              Prevalence : 0.8190          
##          Detection Rate : 0.8000          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.7252          
##                                           
##        'Positive' Class : 0               
## 
CM_radial
## Confusion Matrix and Statistics
## 
##    
##      0  1
##   0 86  8
##   1  2  9
##                                           
##                Accuracy : 0.9048          
##                  95% CI : (0.8318, 0.9534)
##     No Information Rate : 0.8381          
##     P-Value [Acc > NIR] : 0.0362          
##                                           
##                   Kappa : 0.5908          
##                                           
##  Mcnemar's Test P-Value : 0.1138          
##                                           
##             Sensitivity : 0.9773          
##             Specificity : 0.5294          
##          Pos Pred Value : 0.9149          
##          Neg Pred Value : 0.8182          
##              Prevalence : 0.8381          
##          Detection Rate : 0.8190          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.7533          
##                                           
##        'Positive' Class : 0               
## 
CM_poly
## Confusion Matrix and Statistics
## 
##    
##      0  1
##   0 84 10
##   1  3  8
##                                           
##                Accuracy : 0.8762          
##                  95% CI : (0.7976, 0.9324)
##     No Information Rate : 0.8286          
##     P-Value [Acc > NIR] : 0.11937         
##                                           
##                   Kappa : 0.4847          
##                                           
##  Mcnemar's Test P-Value : 0.09609         
##                                           
##             Sensitivity : 0.9655          
##             Specificity : 0.4444          
##          Pos Pred Value : 0.8936          
##          Neg Pred Value : 0.7273          
##              Prevalence : 0.8286          
##          Detection Rate : 0.8000          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.7050          
##                                           
##        'Positive' Class : 0               
## 
CM_RS
## Confusion Matrix and Statistics
## 
##    
##      0  1
##   0 93  1
##   1  2  9
##                                           
##                Accuracy : 0.9714          
##                  95% CI : (0.9188, 0.9941)
##     No Information Rate : 0.9048          
##     P-Value [Acc > NIR] : 0.007949        
##                                           
##                   Kappa : 0.8413          
##                                           
##  Mcnemar's Test P-Value : 1.000000        
##                                           
##             Sensitivity : 0.9789          
##             Specificity : 0.9000          
##          Pos Pred Value : 0.9894          
##          Neg Pred Value : 0.8182          
##              Prevalence : 0.9048          
##          Detection Rate : 0.8857          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.9395          
##                                           
##        'Positive' Class : 0               
## 

Looking at the confusion matrices of the final kernels with tuned parameters, we notice that, with the exception of the radial-sigma kernel, specificity is very low regardless of the kernel, so many infected patients are not recognized as such. Compared with the other kernels, the radial-sigma kernel clearly performs better, in accuracy as well as in sensitivity and specificity. As a next step, let us look at a more compact overview table:
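
To make the reported specificity concrete, here is a small recomputation from the linear kernel's 2x2 table above. Note that `confusionMatrix(table(actual, prediction))` treats the table's columns as the reference, which is how the values above arise; with 'Positive' class 0, specificity is the fraction of the "1" column that lands in the "1" cell:

```r
# Recompute the linear kernel's specificity from its 2x2 table above
cm <- matrix(c(84, 10,
                2,  9),
             nrow = 2, byrow = TRUE,
             dimnames = list(c("0", "1"), c("0", "1")))
spec_linear <- cm["1", "1"] / sum(cm[, "1"])   # 9 / 19 = 0.4737
```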

results <- resamples(list(Linear=svm_fit_linear, Radial=svm_fit_radial, Polynomial=svm_fit_poly))
results$values
##       Resample Linear~Accuracy Linear~Kappa Radial~Accuracy Radial~Kappa
## 1  Fold01.Rep1       0.8947368    0.7894737       0.9605263    0.9210526
## 2  Fold01.Rep2       0.9210526    0.8421053       0.9605263    0.9210526
## 3  Fold01.Rep3       0.8421053    0.6842105       0.9210526    0.8421053
## 4  Fold02.Rep1       0.8552632    0.7105263       0.9210526    0.8421053
## 5  Fold02.Rep2       0.8815789    0.7631579       0.9605263    0.9210526
## 6  Fold02.Rep3       0.9210526    0.8421053       0.9736842    0.9473684
## 7  Fold03.Rep1       0.9078947    0.8157895       0.9605263    0.9210526
## 8  Fold03.Rep2       0.8421053    0.6842105       0.9078947    0.8157895
## 9  Fold03.Rep3       0.9078947    0.8157895       0.9736842    0.9473684
## 10 Fold04.Rep1       0.8552632    0.7105263       0.8289474    0.6578947
## 11 Fold04.Rep2       0.8815789    0.7631579       0.9210526    0.8421053
## 12 Fold04.Rep3       0.7894737    0.5789474       0.8815789    0.7631579
## 13 Fold05.Rep1       0.8289474    0.6578947       0.9078947    0.8157895
## 14 Fold05.Rep2       0.9473684    0.8947368       0.9473684    0.8947368
## 15 Fold05.Rep3       0.8947368    0.7894737       0.9342105    0.8684211
## 16 Fold06.Rep1       0.8815789    0.7631579       0.9736842    0.9473684
## 17 Fold06.Rep2       0.7763158    0.5526316       0.9342105    0.8684211
## 18 Fold06.Rep3       0.7894737    0.5789474       0.9210526    0.8421053
## 19 Fold07.Rep1       0.9078947    0.8157895       1.0000000    1.0000000
## 20 Fold07.Rep2       0.8421053    0.6842105       0.9605263    0.9210526
## 21 Fold07.Rep3       0.8289474    0.6578947       0.9210526    0.8421053
## 22 Fold08.Rep1       0.9078947    0.8157895       0.9210526    0.8421053
## 23 Fold08.Rep2       0.8289474    0.6578947       0.9473684    0.8947368
## 24 Fold08.Rep3       0.8815789    0.7631579       0.9868421    0.9736842
## 25 Fold09.Rep1       0.8157895    0.6315789       0.9868421    0.9736842
## 26 Fold09.Rep2       0.8947368    0.7894737       0.9342105    0.8684211
## 27 Fold09.Rep3       0.9078947    0.8157895       0.9736842    0.9473684
## 28 Fold10.Rep1       0.8157895    0.6315789       0.9473684    0.8947368
## 29 Fold10.Rep2       0.8026316    0.6052632       0.8815789    0.7631579
## 30 Fold10.Rep3       0.8684211    0.7368421       0.9210526    0.8421053
##    Polynomial~Accuracy Polynomial~Kappa
## 1            0.9605263        0.9210526
## 2            0.9605263        0.9210526
## 3            0.8947368        0.7894737
## 4            0.9342105        0.8684211
## 5            0.9868421        0.9736842
## 6            0.9736842        0.9473684
## 7            0.9342105        0.8684211
## 8            0.9473684        0.8947368
## 9            0.9736842        0.9473684
## 10           0.8421053        0.6842105
## 11           0.9078947        0.8157895
## 12           0.8684211        0.7368421
## 13           0.9210526        0.8421053
## 14           0.9736842        0.9473684
## 15           0.9210526        0.8421053
## 16           0.9736842        0.9473684
## 17           0.9210526        0.8421053
## 18           0.9210526        0.8421053
## 19           1.0000000        1.0000000
## 20           0.9605263        0.9210526
## 21           0.9078947        0.8157895
## 22           0.9210526        0.8421053
## 23           0.9473684        0.8947368
## 24           0.9868421        0.9736842
## 25           1.0000000        1.0000000
## 26           0.9210526        0.8421053
## 27           0.9736842        0.9473684
## 28           0.9473684        0.8947368
## 29           0.9078947        0.8157895
## 30           0.9078947        0.8157895
summary(results)
## 
## Call:
## summary.resamples(object = results)
## 
## Models: Linear, Radial, Polynomial 
## Number of resamples: 30 
## 
## Accuracy 
##                 Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## Linear     0.7763158 0.8289474 0.8750000 0.8640351 0.9046053 0.9473684    0
## Radial     0.8289474 0.9210526 0.9407895 0.9390351 0.9605263 1.0000000    0
## Polynomial 0.8421053 0.9210526 0.9407895 0.9399123 0.9736842 1.0000000    0
## 
## Kappa 
##                 Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## Linear     0.5526316 0.6578947 0.7500000 0.7280702 0.8092105 0.8947368    0
## Radial     0.6578947 0.8421053 0.8815789 0.8780702 0.9210526 1.0000000    0
## Polynomial 0.6842105 0.8421053 0.8815789 0.8798246 0.9473684 1.0000000    0
bwplot(results, metric="Accuracy")

Here is another comparison of the kernels' accuracy on the training data. As the boxplot and the summary show, the radial and polynomial kernels perform almost identically (median accuracy 0.9408 each) and both clearly outperform the linear kernel.

modell <- c("SVM Linear", "SVM Radial", "SVM Polynomial", "SVM RadialSigma")
accuracies <- c(CM_linear$overall[1], CM_radial$overall[1], CM_poly$overall[1], CM_RS$overall[1])
sensitivities <- c(CM_linear$byClass[1], CM_radial$byClass[1], CM_poly$byClass[1], CM_RS$byClass[1])
specificities <- c(CM_linear$byClass[2], CM_radial$byClass[2], CM_poly$byClass[2], CM_RS$byClass[2])
results_svm = data.frame(
  "Modell" = modell,
  "Sensitivity" = sensitivities,
  "Specificity" = specificities,
  "Test Accuracy" = accuracies
)

kable_styling(kable(results_svm, format = "html", digits = 4), full_width = FALSE)
Modell           Sensitivity  Specificity  Test.Accuracy
SVM Linear            0.9767       0.4737         0.8857
SVM Radial            0.9773       0.5294         0.9048
SVM Polynomial        0.9655       0.4444         0.8762
SVM RadialSigma       0.9789       0.9000         0.9714

As the final results table shows, the radial-sigma model performs similarly to the other models in terms of sensitivity, but it has a far higher specificity, which leads to a markedly better accuracy. Among the SVMs, this model is therefore clearly the one to prefer.
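
Since every model above was tuned on accuracy, one possible follow-up (a sketch only, not run here) would be to let caret select the SVM by specificity instead, using `twoClassSummary`:

```r
# Hypothetical retuning on specificity instead of accuracy
ctrl_spec <- trainControl(method = "cv",
                          classProbs = TRUE,
                          summaryFunction = twoClassSummary)
train_spec <- up_train_svm
# classProbs = TRUE requires syntactically valid class level names
levels(train_spec$target) <- c("neg", "pos")
svm_fit_spec <- train(target ~ ., data = train_spec,
                      method = "svmRadialSigma",
                      metric = "Spec",   # select the grid point by specificity
                      trControl = ctrl_spec)
```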

Neural Networks

In this section we try to improve the prediction of a COVID-19 infection with the help of several neural networks.

data_train <- data_clean[train_idx, ]
data_test <- data_clean[-train_idx, ]
ggplot(data = data_train, aes(x = target)) + 
  geom_bar()

The response variable in the training set is strongly imbalanced, so we face the well-known rare-class problem in classification.

table(data_train$target)
## 
##   0   1 
## 380  47

To give the neural network a better training basis, we upsample the training data, which removes the rare-class problem from the training set. Afterwards the training set contains 380 COVID-positive and 380 COVID-negative patients.

set.seed(1910837262)
up_train_nn <- upSample(x = data_train[, -ncol(data_train)],  # note: drops the last feature column, not target
                     y = as.factor(data_train$target))                         
table(up_train_nn$target) 
## 
##   0   1 
## 380 380
up_train_nn <- up_train_nn %>%
    select(-Class)
str(up_train_nn)
## 'data.frame':    760 obs. of  16 variables:
##  $ Patient.age.quantile                                 : num  17 1 9 11 13 9 17 17 19 10 ...
##  $ target                                               : Factor w/ 2 levels "0","1": 1 1 1 1 1 1 1 1 1 1 ...
##  $ Patient.addmited.to.regular.ward..1.yes..0.no.       : Factor w/ 2 levels "0","1": 1 1 1 1 1 2 1 1 1 1 ...
##  $ Patient.addmited.to.semi.intensive.unit..1.yes..0.no.: Factor w/ 2 levels "0","1": 1 2 1 1 1 1 1 1 2 1 ...
##  $ Patient.addmited.to.intensive.care.unit..1.yes..0.no.: Factor w/ 2 levels "0","1": 1 1 1 1 1 1 1 1 1 1 ...
##  $ sickness                                             : Factor w/ 2 levels "0","1": 2 1 2 2 2 2 2 2 2 1 ...
##  $ Hematocrit                                           : num  0.237 -1.572 -0.748 0.992 1.015 ...
##  $ Platelets                                            : num  -0.517 1.43 -0.429 0.073 -0.178 ...
##  $ Mean.platelet.volume                                 : num  0.0107 -1.6722 -0.2137 -0.5503 0.796 ...
##  $ Lymphocytes                                          : num  0.31837 -0.00574 -1.11451 0.04544 -0.73071 ...
##  $ Mean.corpuscular.hemoglobin.concentration..MCHC.     : num  -0.951 3.331 0.543 -0.453 -0.353 ...
##  $ Leukocytes                                           : num  -0.0946 0.3646 -0.8849 -0.2115 -0.0751 ...
##  $ Basophils                                            : num  -0.2238 -0.2238 0.0817 -0.8347 2.5254 ...
##  $ Mean.corpuscular.hemoglobin..MCH.                    : num  -0.292 0.178 1.746 0.335 0.544 ...
##  $ Eosinophils                                          : num  1.482 1.019 -0.667 -0.709 0.218 ...
##  $ Monocytes                                            : num  0.3575 0.0687 1.2768 -0.2202 0.0687 ...
data_test_x <- data_test %>%
  select(-target)

Before training the neural network, we first have to convert the input data into a numeric matrix.

library(neuralnet)
## 
## Attaching package: 'neuralnet'
## The following object is masked from 'package:dplyr':
## 
##     compute
#preprocessParams <- preProcess(up_train_nn, method=c("scale"))

up_train_nn_matrix <-  as.matrix(sapply(up_train_nn, as.numeric))


modell_nn1 <- neuralnet(target ~., data = up_train_nn_matrix, hidden=c(10), linear.output = FALSE)

The first neural network was trained with the neuralnet library. The input data were not scaled, and all features of the training set were used to predict the target variable. For this first attempt we used a network with a single hidden layer of 10 neurons.

plot(modell_nn1, rep = "best")

Now we want to compute the network's accuracy on the test data.

data_test_nn_x <- data_test %>%
  select(-target)
data_test_nn_x <-  as.matrix(sapply(data_test_nn_x, as.numeric))

predict_testNN_1 = compute(modell_nn1, data_test_nn_x)
predict_testNN_1<-sapply(predict_testNN_1$net.result,round,digits=0)
nn_table1 <- table(data_test$target, predict_testNN_1)
results_nn1 <- data.frame(actual = data_test$target, prediction = predict_testNN_1)
#attach(results_nn1)
nn_table1
##    predict_testNN_1
##      1
##   0 94
##   1 11

The first network assigns every test patient to class 1 and is therefore of no practical use. Next we train a network with the nnet library.

library(nnet)
up_train_nn$target = class.ind(up_train_nn$target)
data_test_nn <- data_test
data_test_nn$target = class.ind(data_test_nn$target) 
data_test_nn_x <- data_test_nn %>%
  select(-target)
modell_nn2 <- nnet(target ~ ., data = up_train_nn, size = 2, rang = 0.1, maxit = 200, decay=5e-4, softmax = TRUE )
## # weights:  38
## initial  value 527.040057 
## iter  10 value 452.372850
## iter  20 value 391.383463
## iter  30 value 321.530598
## iter  40 value 305.369051
## iter  50 value 295.077802
## iter  60 value 293.577716
## iter  70 value 291.497938
## iter  80 value 276.942523
## iter  90 value 251.193745
## iter 100 value 239.385363
## iter 110 value 214.921572
## iter 120 value 206.363383
## iter 130 value 200.540603
## iter 140 value 195.243433
## iter 150 value 184.831926
## iter 160 value 178.861755
## iter 170 value 178.054813
## iter 180 value 177.615947
## iter 190 value 174.815629
## iter 200 value 173.358834
## final  value 173.358834 
## stopped after 200 iterations
#import the function from Github
library(devtools)
## Loading required package: usethis
source_url('https://gist.githubusercontent.com/fawda123/7471137/raw/466c1474d0a505ff044412703516c34f1a4684a5/nnet_plot_update.r')
## SHA-1 hash of file is 74c80bd5ddbc17ab3ae5ece9c0ed9beb612e87ef
plot.nnet(modell_nn2)
## Loading required package: scales
## Loading required package: reshape
## 
## Attaching package: 'reshape'
## The following object is masked from 'package:dplyr':
## 
##     rename

predict_testNN_2 <- predict(modell_nn2, data_test_nn_x)
predict_testNN_2<-sapply(predict_testNN_2,round,digits=0)
#table(data_test_nn$target[,2], predict_testNN_2[107:212])
conf_nn2 <-confusionMatrix(table(data_test_nn$target[,2], predict_testNN_2[105:209]))
conf_nn2
## Confusion Matrix and Statistics
## 
##    
##      0  1
##   0 72 22
##   1  6  5
##                                           
##                Accuracy : 0.7333          
##                  95% CI : (0.6381, 0.8149)
##     No Information Rate : 0.7429          
##     P-Value [Acc > NIR] : 0.637117        
##                                           
##                   Kappa : 0.1343          
##                                           
##  Mcnemar's Test P-Value : 0.004586        
##                                           
##             Sensitivity : 0.9231          
##             Specificity : 0.1852          
##          Pos Pred Value : 0.7660          
##          Neg Pred Value : 0.4545          
##              Prevalence : 0.7429          
##          Detection Rate : 0.6857          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.5541          
##                                           
##        'Positive' Class : 0               
## 

This simple neural network reaches an accuracy of roughly 73%, which, however, has little meaning on this data. Specificity is the metric that interests us most here, and the model only reaches about 19%.
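
To see why accuracy alone says little here, consider the class counts of the test set (94 negatives, 11 positives, as in the tables above): a degenerate model that declares every patient healthy already looks quite accurate while detecting no infection at all. A minimal illustration:

```r
# Baseline: always predict "not infected" on the 105 test patients
actual      <- c(rep(0, 94), rep(1, 11))
always_zero <- rep(0, length(actual))
mean(always_zero == actual)   # 94/105, about 0.895 accuracy, with 0 specificity
```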

acc_nn2 <- conf_nn2$overall[1]
sens_nn2 <- conf_nn2$byClass[1]
spec_nn2 <- conf_nn2$byClass[2]

Scaling and one-hot encoding the data:

Now we train the same neural network, but with preprocessed data: we dummy-encode the factor variables and center and scale the numeric ones. Afterwards we check whether this improves the model significantly.

glimpse(up_train_nn)
## Rows: 760
## Columns: 16
## $ Patient.age.quantile                                  <dbl> 17, 1, 9, 11, 1…
## $ target                                                <dbl[,2]> <matrix[26 …
## $ Patient.addmited.to.regular.ward..1.yes..0.no.        <fct> 0, 0, 0, 0, 0, …
## $ Patient.addmited.to.semi.intensive.unit..1.yes..0.no. <fct> 0, 1, 0, 0, 0, …
## $ Patient.addmited.to.intensive.care.unit..1.yes..0.no. <fct> 0, 0, 0, 0, 0, …
## $ sickness                                              <fct> 1, 0, 1, 1, 1, …
## $ Hematocrit                                            <dbl> 0.23651545, -1.…
## $ Platelets                                             <dbl> -0.51741302, 1.…
## $ Mean.platelet.volume                                  <dbl> 0.01067657, -1.…
## $ Lymphocytes                                           <dbl> 0.318365753, -0…
## $ Mean.corpuscular.hemoglobin.concentration..MCHC.      <dbl> -0.95079035, 3.…
## $ Leukocytes                                            <dbl> -9.461035e-02, …
## $ Basophils                                             <dbl> -0.22376651, -0…
## $ Mean.corpuscular.hemoglobin..MCH.                     <dbl> -0.29226932, 0.…
## $ Eosinophils                                           <dbl> 1.48215818, 1.0…
## $ Monocytes                                             <dbl> 0.35754666, 0.0…
library(ade4)
library(data.table)
## 
## Attaching package: 'data.table'
## The following object is masked from 'package:reshape':
## 
##     melt
## The following objects are masked from 'package:dplyr':
## 
##     between, first, last
ohe_feats = c( 'Patient.addmited.to.regular.ward..1.yes..0.no.', 'Patient.addmited.to.semi.intensive.unit..1.yes..0.no.', 'Patient.addmited.to.intensive.care.unit..1.yes..0.no.', "sickness")
for (f in ohe_feats){
  df_all_dummy = acm.disjonctif(up_train_nn[f])
  up_train_nn[f] = NULL
  up_train_nn = cbind(up_train_nn, df_all_dummy)
}
ohe_feats = c('Patient.addmited.to.regular.ward..1.yes..0.no.', 'Patient.addmited.to.semi.intensive.unit..1.yes..0.no.', 'Patient.addmited.to.intensive.care.unit..1.yes..0.no.', "sickness")

for (f in ohe_feats){
  df_all_dummy = acm.disjonctif(data_test_nn[f])
  data_test_nn[f] = NULL
  data_test_nn = cbind(data_test_nn, df_all_dummy)
}
preProcValues <- preProcess(up_train_nn, method = c("center", "scale"))

up_train_nn_transformed <- predict(preProcValues, up_train_nn)
data_test_nn_transformed <- predict(preProcValues, data_test_nn)

data_test_nn_transformed_x <- data_test_nn_transformed %>%
  select(-target)
modell_nn3 <- nnet(target ~ ., data = up_train_nn_transformed, size = 2, rang = 0.1, maxit = 200, decay=5e-4, softmax = TRUE )
## # weights:  46
## initial  value 527.163385 
## iter  10 value 334.507677
## iter  20 value 278.597294
## iter  30 value 220.360043
## iter  40 value 198.271403
## iter  50 value 179.789451
## iter  60 value 174.287902
## iter  70 value 173.924387
## iter  80 value 173.679228
## iter  90 value 173.359790
## iter 100 value 173.063926
## iter 110 value 172.927924
## iter 120 value 172.916586
## iter 130 value 172.909744
## iter 140 value 172.880737
## iter 150 value 172.861204
## iter 160 value 172.860771
## iter 170 value 172.860499
## final  value 172.860495 
## converged
plot.nnet(modell_nn3)

predict_testNN_3 <- predict(modell_nn3, data_test_nn_transformed_x)[,2]
predict_testNN_3<-sapply(predict_testNN_3,round,digits=0)
results_nn3 <- data.frame(actual = data_test_nn$target, prediction = predict_testNN_3)
#attach(results_nn3)

table(results_nn3$actual.1,results_nn3$prediction)
##    
##      0  1
##   0 81 13
##   1  3  8
conf_nn3 <- confusionMatrix(table(results_nn3$actual.1,results_nn3$prediction))
conf_nn3
## Confusion Matrix and Statistics
## 
##    
##      0  1
##   0 81 13
##   1  3  8
##                                           
##                Accuracy : 0.8476          
##                  95% CI : (0.7644, 0.9103)
##     No Information Rate : 0.8             
##     P-Value [Acc > NIR] : 0.13470         
##                                           
##                   Kappa : 0.4203          
##                                           
##  Mcnemar's Test P-Value : 0.02445         
##                                           
##             Sensitivity : 0.9643          
##             Specificity : 0.3810          
##          Pos Pred Value : 0.8617          
##          Neg Pred Value : 0.7273          
##              Prevalence : 0.8000          
##          Detection Rate : 0.7714          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.6726          
##                                           
##        'Positive' Class : 0               
## 

The model with scaled and encoded input data performs clearly better. Specificity reaches 38%, which is still not good, but at least a marked improvement over the previous model. This model classified 3 infected patients as healthy.
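
Instead of rounding the predicted probability at 0.5, the decision threshold could also be lowered to trade some sensitivity for specificity. A sketch with a hypothetical cutoff of 0.3:

```r
# Hypothetical cutoff: flag a patient as infected already at p > 0.3
prob_pos <- predict(modell_nn3, data_test_nn_transformed_x)[, 2]
pred_03  <- as.integer(prob_pos > 0.3)
table(data_test_nn$target[, 2], pred_03)
```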

acc_nn3 <- conf_nn3$overall[1]
sens_nn3 <- conf_nn3$byClass[1]
spec_nn3 <- conf_nn3$byClass[2]

Finally, we train one more neural network model with the caret library. Here too we preprocess the input data, and we apply 10-fold cross-validation, repeated 3 times. For now we use the "normal" training set rather than the upsampled one.

#Caret Modell
TrainingParameters_nn <- trainControl(method = "repeatedcv", number = 10, repeats=3)
modell_nn4 <- train(data_train[,-2], data_train$target,
                  method = "nnet",
                  trControl= TrainingParameters_nn,
                  preProcess=c("scale","center"),
                  na.action = na.omit
)
## # weights:  19
## initial  value 185.408160 
## iter  10 value 84.290613
## iter  20 value 66.778235
## iter  30 value 61.399709
## iter  40 value 59.552666
## iter  50 value 55.115770
## final  value 54.992022 
## converged
## # weights:  55
## initial  value 190.705179 
## iter  10 value 55.844046
## iter  20 value 45.070703
## iter  30 value 42.208468
## iter  40 value 38.894742
## iter  50 value 37.847652
## iter  60 value 36.956438
## iter  70 value 36.939003
## iter  80 value 36.935532
## iter  90 value 36.935020
## final  value 36.935018 
## converged
## # weights:  91
## initial  value 330.173709 
## iter  10 value 61.490610
## iter  20 value 38.691226
## iter  30 value 22.728135
## iter  40 value 17.656707
## iter  50 value 16.299636
## iter  60 value 15.758677
## iter  70 value 15.597162
## iter  80 value 15.490489
## iter  90 value 15.401558
## iter 100 value 15.319556
## final  value 15.319556 
## stopped after 100 iterations
## # weights:  19
## initial  value 222.405721 
## iter  10 value 74.314836
## iter  20 value 67.279782
## iter  30 value 66.752609
## final  value 66.649516 
## converged
## # weights:  55
## initial  value 241.343168 
## iter  10 value 74.032693
## iter  20 value 62.448293
## iter  30 value 57.093623
## iter  40 value 56.339582
## iter  50 value 55.786909
## iter  60 value 55.417339
## iter  70 value 55.320527
## iter  80 value 55.314192
## final  value 55.314191 
## converged
## # weights:  91
## initial  value 286.627491 
## iter  10 value 66.202031
## iter  20 value 55.115106
## iter  30 value 51.251889
## iter  40 value 49.250593
## iter  50 value 48.372418
## iter  60 value 48.118049
## iter  70 value 48.077961
## iter  80 value 48.065241
## iter  90 value 48.063338
## final  value 48.063326 
## converged
## # weights:  19
## initial  value 191.540826 
## iter  10 value 73.093334
## iter  20 value 57.104328
## iter  30 value 51.552355
## iter  40 value 50.282335
## iter  50 value 50.210799
## iter  60 value 50.173916
## iter  70 value 50.166243
## iter  80 value 50.162650
## iter  90 value 50.161904
## iter 100 value 50.160884
## final  value 50.160884 
## stopped after 100 iterations
## # weights:  55
## initial  value 382.035585 
## iter  10 value 61.164785
## iter  20 value 47.019382
## iter  30 value 38.227861
## iter  40 value 33.552050
## iter  50 value 26.059658
## iter  60 value 25.128027
## iter  70 value 24.335718
## iter  80 value 24.190858
## iter  90 value 24.093928
## iter 100 value 24.002794
## final  value 24.002794 
## stopped after 100 iterations
## # weights:  91
## initial  value 206.683352 
## [nnet optimisation trace truncated: repeated per-iteration logs for networks
## with 19, 55 and 91 weights (one block per resample). Most runs stopped at
## the default cap of 100 iterations; the remainder converged earlier.]
## iter  60 value 18.848018
## iter  70 value 18.623635
## iter  80 value 18.444452
## iter  90 value 18.304483
## iter 100 value 18.224000
## final  value 18.224000 
## stopped after 100 iterations
## # weights:  19
## initial  value 218.728435 
## iter  10 value 82.731279
## iter  20 value 66.545404
## iter  30 value 63.431875
## iter  40 value 63.309413
## final  value 63.309410 
## converged
## # weights:  55
## initial  value 229.227220 
## iter  10 value 67.024856
## iter  20 value 56.342505
## iter  30 value 54.737296
## iter  40 value 54.585414
## iter  50 value 54.557621
## iter  60 value 54.555953
## final  value 54.555838 
## converged
## # weights:  91
## initial  value 295.317038 
## iter  10 value 66.544107
## iter  20 value 54.899180
## iter  30 value 50.618323
## iter  40 value 49.353798
## iter  50 value 48.579221
## iter  60 value 48.079477
## iter  70 value 47.764943
## iter  80 value 47.606177
## iter  90 value 47.585161
## iter 100 value 47.581422
## final  value 47.581422 
## stopped after 100 iterations
## # weights:  19
## initial  value 349.448924 
## iter  10 value 69.208816
## iter  20 value 58.899945
## iter  30 value 57.852741
## iter  40 value 55.121104
## iter  50 value 54.663346
## iter  60 value 54.512344
## iter  70 value 54.455280
## iter  80 value 54.413896
## iter  90 value 54.408472
## iter 100 value 54.402627
## final  value 54.402627 
## stopped after 100 iterations
## # weights:  55
## initial  value 195.207010 
## iter  10 value 50.071648
## iter  20 value 32.857156
## iter  30 value 27.318361
## iter  40 value 26.868216
## iter  50 value 26.576187
## iter  60 value 26.407138
## iter  70 value 26.351063
## iter  80 value 26.311904
## iter  90 value 26.284679
## iter 100 value 26.262165
## final  value 26.262165 
## stopped after 100 iterations
## # weights:  91
## initial  value 308.015371 
## iter  10 value 54.478756
## iter  20 value 32.704219
## iter  30 value 24.789417
## iter  40 value 20.092643
## iter  50 value 17.992633
## iter  60 value 16.505467
## iter  70 value 14.847475
## iter  80 value 13.897023
## iter  90 value 13.572274
## iter 100 value 13.471947
## final  value 13.471947 
## stopped after 100 iterations
## # weights:  19
## initial  value 246.993544 
## iter  10 value 64.854400
## iter  20 value 50.831617
## iter  30 value 49.190357
## iter  40 value 48.653893
## iter  50 value 45.938731
## iter  60 value 43.462624
## final  value 43.452623 
## converged
## # weights:  55
## initial  value 363.282004 
## iter  10 value 76.912813
## iter  20 value 50.163897
## iter  30 value 36.187624
## iter  40 value 31.840189
## iter  50 value 29.355228
## iter  60 value 27.350450
## iter  70 value 26.921066
## iter  80 value 26.795160
## iter  90 value 26.760821
## iter 100 value 26.725126
## final  value 26.725126 
## stopped after 100 iterations
## # weights:  91
## initial  value 242.132643 
## iter  10 value 44.018221
## iter  20 value 26.980971
## iter  30 value 17.122800
## iter  40 value 13.879825
## iter  50 value 12.878408
## iter  60 value 12.227754
## iter  70 value 11.907123
## iter  80 value 11.830168
## iter  90 value 11.826454
## iter  90 value 11.826454
## iter  90 value 11.826454
## final  value 11.826454 
## converged
## # weights:  19
## initial  value 386.993634 
## iter  10 value 74.225034
## iter  20 value 60.485415
## iter  30 value 58.237655
## iter  40 value 58.144830
## final  value 58.144794 
## converged
## # weights:  55
## initial  value 357.063125 
## iter  10 value 61.643764
## iter  20 value 50.768323
## iter  30 value 47.630270
## iter  40 value 47.256482
## iter  50 value 47.190805
## final  value 47.186377 
## converged
## # weights:  91
## initial  value 173.103810 
## iter  10 value 58.160232
## iter  20 value 48.402269
## iter  30 value 44.975413
## iter  40 value 44.259588
## iter  50 value 43.854557
## iter  60 value 43.406943
## iter  70 value 43.249164
## iter  80 value 43.111895
## iter  90 value 43.096889
## iter 100 value 43.096250
## final  value 43.096250 
## stopped after 100 iterations
## # weights:  19
## initial  value 267.733736 
## iter  10 value 57.809824
## iter  20 value 48.559612
## iter  30 value 47.099040
## iter  40 value 43.725276
## iter  50 value 43.604019
## iter  60 value 43.474013
## iter  70 value 43.450353
## iter  80 value 43.435979
## iter  90 value 43.432528
## iter 100 value 43.430993
## final  value 43.430993 
## stopped after 100 iterations
## # weights:  55
## initial  value 162.802360 
## iter  10 value 56.896811
## iter  20 value 41.122090
## iter  30 value 34.489656
## iter  40 value 31.019596
## iter  50 value 29.035210
## iter  60 value 27.425231
## iter  70 value 25.438924
## iter  80 value 21.241751
## iter  90 value 19.006461
## iter 100 value 18.084083
## final  value 18.084083 
## stopped after 100 iterations
## # weights:  91
## initial  value 268.422111 
## iter  10 value 41.424717
## iter  20 value 25.821627
## iter  30 value 18.530745
## iter  40 value 16.511387
## iter  50 value 15.730457
## iter  60 value 15.535561
## iter  70 value 15.285947
## iter  80 value 14.975382
## iter  90 value 14.525232
## iter 100 value 14.352778
## final  value 14.352778 
## stopped after 100 iterations
## # weights:  19
## initial  value 335.485174 
## iter  10 value 88.786917
## iter  20 value 72.994772
## iter  30 value 70.186378
## iter  40 value 69.179882
## iter  50 value 68.740282
## iter  60 value 68.309508
## iter  70 value 68.090210
## iter  80 value 67.957336
## iter  90 value 67.906911
## iter 100 value 67.835956
## final  value 67.835956 
## stopped after 100 iterations
## # weights:  55
## initial  value 246.591173 
## iter  10 value 69.450271
## iter  20 value 39.853947
## iter  30 value 32.466321
## iter  40 value 30.571878
## iter  50 value 29.883769
## iter  60 value 28.908355
## iter  70 value 28.411610
## iter  80 value 28.358369
## iter  90 value 28.332736
## iter 100 value 28.323511
## final  value 28.323511 
## stopped after 100 iterations
## # weights:  91
## initial  value 239.013890 
## iter  10 value 58.820510
## iter  20 value 45.325140
## iter  30 value 30.496884
## iter  40 value 19.914527
## iter  50 value 17.891952
## iter  60 value 17.650758
## iter  70 value 17.636494
## iter  80 value 17.618980
## iter  90 value 17.614161
## iter 100 value 17.612108
## final  value 17.612108 
## stopped after 100 iterations
## # weights:  19
## initial  value 395.079760 
## iter  10 value 89.270756
## iter  20 value 71.086782
## iter  30 value 66.752067
## iter  40 value 66.580751
## iter  50 value 66.578693
## final  value 66.578573 
## converged
## # weights:  55
## initial  value 425.717668 
## iter  10 value 72.045461
## iter  20 value 63.907918
## iter  30 value 61.481175
## iter  40 value 59.120582
## iter  50 value 57.522377
## iter  60 value 57.094400
## iter  70 value 55.193180
## iter  80 value 52.794128
## iter  90 value 52.157176
## iter 100 value 52.129110
## final  value 52.129110 
## stopped after 100 iterations
## # weights:  91
## initial  value 437.131068 
## iter  10 value 69.438174
## iter  20 value 54.590007
## iter  30 value 50.646110
## iter  40 value 49.858702
## iter  50 value 49.293383
## iter  60 value 48.827176
## iter  70 value 48.599183
## iter  80 value 48.460784
## iter  90 value 48.242940
## iter 100 value 48.179829
## final  value 48.179829 
## stopped after 100 iterations
## # weights:  19
## initial  value 298.194284 
## iter  10 value 74.782861
## iter  20 value 62.257616
## iter  30 value 59.877646
## iter  40 value 56.697471
## iter  50 value 55.096987
## iter  60 value 53.140371
## iter  70 value 50.923179
## iter  80 value 50.818069
## iter  90 value 50.654575
## iter 100 value 50.641381
## final  value 50.641381 
## stopped after 100 iterations
## # weights:  55
## initial  value 274.434827 
## iter  10 value 60.915074
## iter  20 value 37.323125
## iter  30 value 34.602924
## iter  40 value 34.011539
## iter  50 value 33.821015
## iter  60 value 33.568655
## iter  70 value 33.411168
## iter  80 value 33.373966
## iter  90 value 33.363890
## iter 100 value 33.356627
## final  value 33.356627 
## stopped after 100 iterations
## # weights:  91
## initial  value 212.232202 
## iter  10 value 54.080758
## iter  20 value 32.769799
## iter  30 value 28.555251
## iter  40 value 20.270780
## iter  50 value 17.741457
## iter  60 value 15.314122
## iter  70 value 14.617242
## iter  80 value 14.194513
## iter  90 value 13.844330
## iter 100 value 13.763050
## final  value 13.763050 
## stopped after 100 iterations
## # weights:  19
## initial  value 292.726609 
## iter  10 value 69.913629
## iter  20 value 60.671262
## iter  30 value 59.551713
## iter  40 value 54.510130
## iter  50 value 53.099590
## iter  60 value 53.049305
## iter  70 value 53.031408
## iter  80 value 53.025609
## iter  90 value 53.021245
## iter 100 value 53.015743
## final  value 53.015743 
## stopped after 100 iterations
## # weights:  55
## initial  value 375.186805 
## iter  10 value 53.939380
## iter  20 value 44.308319
## iter  30 value 36.492661
## iter  40 value 34.396573
## iter  50 value 32.822644
## iter  60 value 31.792674
## iter  70 value 31.242835
## iter  80 value 30.871310
## iter  90 value 30.583173
## iter 100 value 30.318773
## final  value 30.318773 
## stopped after 100 iterations
## # weights:  91
## initial  value 368.005445 
## iter  10 value 51.908749
## iter  20 value 28.844593
## iter  30 value 22.949580
## iter  40 value 21.293289
## iter  50 value 20.460648
## iter  60 value 19.922601
## iter  70 value 19.853200
## iter  80 value 19.821346
## final  value 19.821152 
## converged
## # weights:  19
## initial  value 202.900189 
## iter  10 value 77.840766
## iter  20 value 66.982855
## iter  30 value 63.774393
## iter  40 value 63.722049
## iter  40 value 63.722049
## final  value 63.722049 
## converged
## # weights:  55
## initial  value 232.610337 
## iter  10 value 67.132294
## iter  20 value 56.496969
## iter  30 value 53.451045
## iter  40 value 52.990462
## iter  50 value 52.513235
## iter  60 value 52.479696
## final  value 52.479281 
## converged
## # weights:  91
## initial  value 201.759972 
## iter  10 value 64.826158
## iter  20 value 56.088533
## iter  30 value 53.149304
## iter  40 value 50.475047
## iter  50 value 48.174399
## iter  60 value 47.500128
## iter  70 value 47.350342
## iter  80 value 47.345291
## iter  90 value 47.345245
## final  value 47.345240 
## converged
## # weights:  19
## initial  value 231.272813 
## iter  10 value 79.426334
## iter  20 value 67.902903
## iter  30 value 60.321314
## iter  40 value 56.750669
## iter  50 value 54.757981
## iter  60 value 54.151059
## iter  70 value 54.097815
## iter  80 value 54.007340
## iter  90 value 53.943353
## iter 100 value 53.921854
## final  value 53.921854 
## stopped after 100 iterations
## # weights:  55
## initial  value 237.133547 
## iter  10 value 75.079962
## iter  20 value 53.155182
## iter  30 value 47.933497
## iter  40 value 45.056141
## iter  50 value 43.108208
## iter  60 value 42.258520
## iter  70 value 41.493179
## iter  80 value 41.055758
## iter  90 value 38.888900
## iter 100 value 36.960453
## final  value 36.960453 
## stopped after 100 iterations
## # weights:  91
## initial  value 304.133481 
## iter  10 value 56.187131
## iter  20 value 41.816169
## iter  30 value 32.720014
## iter  40 value 31.615542
## iter  50 value 30.843557
## iter  60 value 29.440065
## iter  70 value 27.536155
## iter  80 value 27.131743
## iter  90 value 26.765345
## iter 100 value 25.620792
## final  value 25.620792 
## stopped after 100 iterations
## # weights:  19
## initial  value 176.659804 
## iter  10 value 74.746276
## iter  20 value 58.712423
## iter  30 value 55.977648
## iter  40 value 55.691848
## iter  50 value 55.691301
## iter  60 value 55.690993
## final  value 55.690988 
## converged
## # weights:  55
## initial  value 205.389557 
## iter  10 value 52.897015
## iter  20 value 42.388630
## iter  30 value 38.892629
## iter  40 value 38.262428
## iter  50 value 38.057104
## iter  60 value 37.860433
## iter  70 value 37.501500
## iter  80 value 37.359268
## iter  90 value 37.206188
## iter 100 value 37.095502
## final  value 37.095502 
## stopped after 100 iterations
## # weights:  91
## initial  value 177.626786 
## iter  10 value 52.740711
## iter  20 value 34.602275
## iter  30 value 24.809587
## iter  40 value 18.516754
## iter  50 value 17.827873
## iter  60 value 17.721335
## iter  70 value 17.618842
## iter  80 value 17.596688
## iter  90 value 17.583429
## iter 100 value 17.415425
## final  value 17.415425 
## stopped after 100 iterations
## # weights:  19
## initial  value 252.638539 
## iter  10 value 77.422512
## iter  20 value 65.516822
## iter  30 value 64.253318
## final  value 64.248237 
## converged
## # weights:  55
## initial  value 333.153552 
## iter  10 value 77.709091
## iter  20 value 55.422291
## iter  30 value 51.162756
## iter  40 value 50.539515
## iter  50 value 50.120923
## iter  60 value 49.868815
## iter  70 value 49.828661
## final  value 49.828660 
## converged
## # weights:  91
## initial  value 254.399185 
## iter  10 value 68.990007
## iter  20 value 53.830781
## iter  30 value 50.517721
## iter  40 value 49.629947
## iter  50 value 48.626228
## iter  60 value 48.434094
## iter  70 value 48.341156
## iter  80 value 48.319947
## iter  90 value 48.312915
## final  value 48.312872 
## converged
## # weights:  19
## initial  value 268.054208 
## iter  10 value 81.070256
## iter  20 value 62.621370
## iter  30 value 59.153959
## iter  40 value 58.803400
## iter  50 value 56.657510
## iter  60 value 53.646139
## iter  70 value 51.090887
## iter  80 value 47.887358
## iter  90 value 47.814254
## iter 100 value 47.707347
## final  value 47.707347 
## stopped after 100 iterations
## # weights:  55
## initial  value 290.680831 
## iter  10 value 57.417080
## iter  20 value 44.541535
## iter  30 value 39.008722
## iter  40 value 34.547353
## iter  50 value 34.047702
## iter  60 value 33.817719
## iter  70 value 33.744820
## iter  80 value 33.651535
## iter  90 value 33.577566
## iter 100 value 33.491793
## final  value 33.491793 
## stopped after 100 iterations
## # weights:  91
## initial  value 301.228360 
## iter  10 value 59.377766
## iter  20 value 41.227684
## iter  30 value 35.890780
## iter  40 value 29.665137
## iter  50 value 21.087438
## iter  60 value 18.401496
## iter  70 value 18.056371
## iter  80 value 17.744188
## iter  90 value 17.522298
## iter 100 value 17.272291
## final  value 17.272291 
## stopped after 100 iterations
## # weights:  19
## initial  value 308.510267 
## iter  10 value 65.175531
## iter  20 value 57.156403
## iter  30 value 53.838251
## iter  40 value 50.757632
## iter  50 value 45.887542
## iter  60 value 42.102801
## iter  70 value 42.089890
## final  value 42.089703 
## converged
## # weights:  55
## initial  value 506.865483 
## iter  10 value 56.454472
## iter  20 value 43.933651
## iter  30 value 34.364813
## iter  40 value 25.631085
## iter  50 value 22.448985
## iter  60 value 22.260628
## iter  70 value 22.157452
## iter  80 value 22.151375
## iter  90 value 22.132775
## iter 100 value 22.069637
## final  value 22.069637 
## stopped after 100 iterations
## # weights:  91
## initial  value 246.560998 
## iter  10 value 47.404136
## iter  20 value 33.095846
## iter  30 value 26.768508
## iter  40 value 24.783396
## iter  50 value 24.189613
## iter  60 value 23.934137
## iter  70 value 23.608166
## iter  80 value 23.391239
## iter  90 value 22.605924
## iter 100 value 20.563030
## final  value 20.563030 
## stopped after 100 iterations
## # weights:  19
## initial  value 233.928529 
## iter  10 value 81.987972
## iter  20 value 65.106444
## iter  30 value 63.242023
## iter  40 value 63.086571
## final  value 63.086550 
## converged
## # weights:  55
## initial  value 419.993554 
## iter  10 value 74.639699
## iter  20 value 61.248177
## iter  30 value 55.540710
## iter  40 value 54.054726
## iter  50 value 53.731017
## iter  60 value 53.607004
## iter  70 value 53.513178
## iter  80 value 53.501086
## final  value 53.501057 
## converged
## # weights:  91
## initial  value 575.569912 
## iter  10 value 68.409659
## iter  20 value 54.144896
## iter  30 value 52.063758
## iter  40 value 51.197946
## iter  50 value 50.772029
## iter  60 value 50.509610
## iter  70 value 49.968309
## iter  80 value 49.340129
## iter  90 value 49.171858
## iter 100 value 49.164422
## final  value 49.164422 
## stopped after 100 iterations
## # weights:  19
## initial  value 326.418925 
## iter  10 value 71.496288
## iter  20 value 58.906654
## iter  30 value 56.879382
## iter  40 value 55.397204
## iter  50 value 54.653339
## iter  60 value 53.961206
## iter  70 value 53.827769
## iter  80 value 53.808568
## iter  90 value 53.802250
## iter 100 value 53.796787
## final  value 53.796787 
## stopped after 100 iterations
## # weights:  55
## initial  value 356.966037 
## iter  10 value 57.481627
## iter  20 value 47.387121
## iter  30 value 42.258523
## iter  40 value 41.182159
## iter  50 value 41.086212
## iter  60 value 41.046748
## iter  70 value 41.014348
## iter  80 value 40.890582
## iter  90 value 40.752865
## iter 100 value 40.690041
## final  value 40.690041 
## stopped after 100 iterations
## # weights:  91
## initial  value 433.221101 
## iter  10 value 58.853868
## iter  20 value 37.244193
## iter  30 value 31.042684
## iter  40 value 29.657283
## iter  50 value 29.245143
## iter  60 value 29.162437
## iter  70 value 29.117516
## iter  80 value 29.055515
## iter  90 value 29.030668
## iter 100 value 28.991216
## final  value 28.991216 
## stopped after 100 iterations
## # weights:  19
## initial  value 226.359644 
## iter  10 value 83.421895
## iter  20 value 57.601544
## iter  30 value 55.636622
## iter  40 value 51.539011
## iter  50 value 50.996333
## iter  60 value 50.986604
## iter  70 value 50.976768
## iter  80 value 50.975552
## iter  90 value 50.975008
## iter 100 value 50.973890
## final  value 50.973890 
## stopped after 100 iterations
## # weights:  55
## initial  value 254.606484 
## iter  10 value 55.796377
## iter  20 value 43.282493
## iter  30 value 30.808571
## iter  40 value 25.870439
## iter  50 value 24.340801
## iter  60 value 21.542650
## iter  70 value 21.307249
## iter  80 value 21.076943
## iter  90 value 20.978079
## iter 100 value 20.891097
## final  value 20.891097 
## stopped after 100 iterations
## # weights:  91
## initial  value 266.687156 
## iter  10 value 52.766730
## iter  20 value 34.290241
## iter  30 value 18.394648
## iter  40 value 15.739678
## iter  50 value 14.757027
## iter  60 value 14.473370
## iter  70 value 14.430354
## iter  80 value 14.426960
## iter  90 value 14.425404
## iter 100 value 14.424261
## final  value 14.424261 
## stopped after 100 iterations
## # weights:  19
## initial  value 243.784791 
## iter  10 value 69.420989
## iter  20 value 63.632686
## iter  30 value 63.176734
## final  value 63.175660 
## converged
## # weights:  55
## initial  value 231.790367 
## iter  10 value 66.962946
## iter  20 value 54.097152
## iter  30 value 48.524278
## iter  40 value 48.054175
## iter  50 value 47.938968
## iter  60 value 47.936764
## final  value 47.936697 
## converged
## # weights:  91
## initial  value 225.554503 
## iter  10 value 64.249536
## iter  20 value 49.836253
## iter  30 value 45.149617
## iter  40 value 43.173307
## iter  50 value 42.860880
## iter  60 value 42.765625
## iter  70 value 42.748458
## iter  80 value 42.745683
## iter  90 value 42.745641
## final  value 42.745640 
## converged
## # weights:  19
## initial  value 376.020243 
## iter  10 value 72.785470
## iter  20 value 56.933482
## iter  30 value 55.552897
## iter  40 value 54.286446
## iter  50 value 54.103831
## iter  60 value 53.996203
## iter  70 value 53.336708
## iter  80 value 53.018497
## iter  90 value 53.014349
## final  value 53.003529 
## converged
## # weights:  55
## initial  value 228.597390 
## iter  10 value 52.531193
## iter  20 value 39.977729
## iter  30 value 33.029947
## iter  40 value 32.095126
## iter  50 value 31.611059
## iter  60 value 31.537681
## iter  70 value 31.494439
## iter  80 value 31.314692
## iter  90 value 31.014655
## iter 100 value 30.977082
## final  value 30.977082 
## stopped after 100 iterations
## # weights:  91
## initial  value 389.519148 
## iter  10 value 46.969750
## iter  20 value 25.260254
## iter  30 value 16.263974
## iter  40 value 15.032825
## iter  50 value 14.794538
## iter  60 value 14.607409
## iter  70 value 14.506943
## iter  80 value 14.426376
## iter  90 value 14.332939
## iter 100 value 12.320685
## final  value 12.320685 
## stopped after 100 iterations
## # weights:  19
## initial  value 404.435684 
## iter  10 value 63.851592
## iter  20 value 47.622620
## iter  30 value 45.708615
## iter  40 value 41.289196
## iter  50 value 41.161106
## iter  60 value 41.079461
## iter  70 value 41.037109
## iter  80 value 41.003518
## iter  90 value 40.991750
## iter 100 value 40.985880
## final  value 40.985880 
## stopped after 100 iterations
## # weights:  55
## initial  value 445.622516 
## iter  10 value 62.288557
## iter  20 value 37.581901
## iter  30 value 32.008241
## iter  40 value 30.865928
## iter  50 value 29.833520
## iter  60 value 29.305165
## iter  70 value 29.031236
## iter  80 value 29.023545
## iter  90 value 29.013768
## iter 100 value 28.864291
## final  value 28.864291 
## stopped after 100 iterations
## # weights:  91
## initial  value 299.653375 
## iter  10 value 56.076666
## iter  20 value 36.684342
## iter  30 value 16.573535
## iter  40 value 11.523634
## iter  50 value 10.960363
## iter  60 value 10.835304
## iter  70 value 10.799262
## iter  80 value 10.777075
## iter  90 value 10.751904
## iter 100 value 10.739397
## final  value 10.739397 
## stopped after 100 iterations
## # weights:  19
## initial  value 256.771574 
## iter  10 value 81.351277
## iter  20 value 63.035346
## iter  30 value 59.753031
## iter  40 value 59.568053
## final  value 59.568049 
## converged
## # weights:  55
## initial  value 246.745356 
## iter  10 value 67.535588
## iter  20 value 53.520296
## iter  30 value 50.410631
## iter  40 value 49.903607
## iter  50 value 49.834180
## iter  60 value 49.715581
## iter  70 value 49.650781
## iter  80 value 49.619211
## iter  90 value 49.615374
## final  value 49.615234 
## converged
## # weights:  91
## initial  value 265.788969 
## iter  10 value 62.393216
## iter  20 value 48.888391
## iter  30 value 46.389413
## iter  40 value 45.842192
## iter  50 value 45.466308
## iter  60 value 45.372780
## iter  70 value 45.275407
## iter  80 value 45.219046
## iter  90 value 44.883669
## iter 100 value 44.813713
## final  value 44.813713 
## stopped after 100 iterations
## # weights:  19
## initial  value 256.849167 
## iter  10 value 70.630037
## iter  20 value 62.484450
## iter  30 value 51.678077
## iter  40 value 49.903268
## iter  50 value 47.723206
## iter  60 value 47.668103
## iter  70 value 47.477461
## iter  80 value 47.421828
## iter  90 value 47.405091
## iter 100 value 47.404375
## final  value 47.404375 
## stopped after 100 iterations
## # weights:  55
## initial  value 175.720942 
## iter  10 value 62.543955
## iter  20 value 49.125390
## iter  30 value 36.156279
## iter  40 value 28.030350
## iter  50 value 27.121846
## iter  60 value 26.950689
## iter  70 value 26.822914
## iter  80 value 26.781882
## iter  90 value 26.766154
## iter 100 value 26.743199
## final  value 26.743199 
## stopped after 100 iterations
## # weights:  91
## initial  value 223.327952 
## iter  10 value 46.047272
## iter  20 value 25.873462
## iter  30 value 18.601254
## iter  40 value 13.071544
## iter  50 value 12.298073
## iter  60 value 11.922616
## iter  70 value 11.678080
## iter  80 value 11.368248
## iter  90 value 11.133273
## iter 100 value 10.970552
## final  value 10.970552 
## stopped after 100 iterations
## # weights:  19
## initial  value 330.969961 
## iter  10 value 87.058372
## iter  20 value 73.237998
## iter  30 value 65.447819
## iter  40 value 61.160233
## iter  50 value 57.332805
## iter  60 value 55.797939
## iter  70 value 54.725563
## iter  80 value 54.277106
## iter  90 value 54.127989
## iter 100 value 53.207594
## final  value 53.207594 
## stopped after 100 iterations
## # weights:  55
## initial  value 351.819312 
## iter  10 value 67.400613
## iter  20 value 50.579067
## iter  30 value 46.302666
## iter  40 value 43.765555
## iter  50 value 41.774631
## iter  60 value 40.852661
## iter  70 value 39.840004
## iter  80 value 38.889692
## iter  90 value 36.209944
## iter 100 value 35.226811
## final  value 35.226811 
## stopped after 100 iterations
## # weights:  91
## initial  value 203.973083 
## iter  10 value 51.255658
## iter  20 value 34.004862
## iter  30 value 22.665597
## iter  40 value 20.020491
## iter  50 value 17.655464
## iter  60 value 17.256489
## iter  70 value 17.118147
## iter  80 value 16.982694
## iter  90 value 16.945811
## iter 100 value 16.929445
## final  value 16.929445 
## stopped after 100 iterations
## # weights:  19
## initial  value 181.877420 
## iter  10 value 88.790427
## iter  20 value 71.693908
## iter  30 value 68.015926
## iter  40 value 67.374472
## iter  50 value 67.368610
## final  value 67.368403 
## converged
## # weights:  55
## initial  value 401.717217 
## iter  10 value 74.795276
## iter  20 value 61.114334
## iter  30 value 56.293936
## iter  40 value 54.769175
## iter  50 value 54.353985
## iter  60 value 54.348155
## final  value 54.348076 
## converged
## # weights:  91
## initial  value 308.006191 
## iter  10 value 65.322943
## iter  20 value 55.614159
## iter  30 value 52.138236
## iter  40 value 51.458945
## iter  50 value 50.703791
## iter  60 value 50.329745
## iter  70 value 49.531731
## iter  80 value 49.094528
## iter  90 value 48.967336
## iter 100 value 48.858398
## final  value 48.858398 
## stopped after 100 iterations
## # weights:  19
## initial  value 180.350025 
## iter  10 value 83.818034
## iter  20 value 65.183664
## iter  30 value 60.543547
## iter  40 value 59.438830
## iter  50 value 59.271991
## iter  60 value 58.542066
## iter  70 value 58.295300
## iter  80 value 58.242745
## iter  90 value 58.222294
## iter 100 value 58.214114
## final  value 58.214114 
## stopped after 100 iterations
## # weights:  55
## initial  value 236.032368 
## iter  10 value 55.506492
## iter  20 value 46.597901
## iter  30 value 40.327944
## iter  40 value 39.742530
## iter  50 value 39.528536
## iter  60 value 39.452433
## iter  70 value 39.357597
## iter  80 value 39.152169
## iter  90 value 39.083448
## iter 100 value 39.032604
## final  value 39.032604 
## stopped after 100 iterations
## # weights:  91
## initial  value 330.569032 
## iter  10 value 74.512851
## iter  20 value 50.084702
## iter  30 value 36.491565
## iter  40 value 28.558547
## iter  50 value 25.719708
## iter  60 value 23.523773
## iter  70 value 21.287419
## iter  80 value 20.560929
## iter  90 value 19.819595
## iter 100 value 18.458991
## final  value 18.458991 
## stopped after 100 iterations
## # weights:  19
## initial  value 300.324998 
## iter  10 value 80.204794
## iter  20 value 52.285119
## iter  30 value 50.654482
## iter  40 value 49.939125
## iter  50 value 48.993752
## iter  60 value 43.350994
## iter  70 value 42.091219
## final  value 42.088122 
## converged
## # weights:  55
## initial  value 211.541823 
## iter  10 value 50.573964
## iter  20 value 30.954622
## iter  30 value 24.808440
## iter  40 value 22.942432
## iter  50 value 22.429798
## iter  60 value 22.428226
## iter  70 value 22.427580
## final  value 22.427578 
## converged
## # weights:  91
## initial  value 424.472139 
## iter  10 value 48.535058
## iter  20 value 29.211769
## iter  30 value 22.570950
## iter  40 value 17.612961
## iter  50 value 17.075583
## iter  60 value 16.324980
## iter  70 value 14.809099
## iter  80 value 14.775694
## iter  90 value 14.763743
## iter 100 value 14.743411
## final  value 14.743411 
## stopped after 100 iterations
## # weights:  19
## initial  value 273.126146 
## iter  10 value 78.188293
## iter  20 value 63.997637
## iter  30 value 60.622828
## iter  40 value 60.546126
## iter  50 value 60.545315
## final  value 60.545287 
## converged
## (nnet training trace for the remaining cross-validation resamples and tuning combinations omitted; every run either converged or was stopped after 100 iterations)
plot(modell_nn4)

The best model is obtained with 3 hidden units and a weight decay of 0.1, reaching a training accuracy of 93%. It is also striking that with a decay of 0.1 the number of hidden units hardly matters, as the accuracy stays at roughly 93% in every case. The models with slightly different decay weights perform somewhat worse, with training accuracies between just over 89% and 92%.

modell_nn4_best <- modell_nn4$bestTune
modell_nn4_best
##   size decay
## 6    3   0.1

The best model uses 3 hidden units and a weight decay of 0.1.

predict_testNN_4 = predict(modell_nn4, data_test)
#predict_testNN_4 <-sapply(predict_testNN_4,round,digits=0)
nn_table4 <- table(data_test$target, predict_testNN_4)

On the test data, too, the NN trained with the caret function performs better than the two models fitted with the nnet function. The test accuracy is above 91% and the recall is 60%. However, 5 patients were incorrectly reported as healthy even though they are infected with corona, which is exactly what we want to avoid.

results_nn4 <- data.frame(actual = data_test$target, prediction = predict_testNN_4)
conf_nn4 <- confusionMatrix(nn_table4)
conf_nn4
## Confusion Matrix and Statistics
## 
##    predict_testNN_4
##      0  1
##   0 90  4
##   1  5  6
##                                           
##                Accuracy : 0.9143          
##                  95% CI : (0.8435, 0.9601)
##     No Information Rate : 0.9048          
##     P-Value [Acc > NIR] : 0.4516          
##                                           
##                   Kappa : 0.5239          
##                                           
##  Mcnemar's Test P-Value : 1.0000          
##                                           
##             Sensitivity : 0.9474          
##             Specificity : 0.6000          
##          Pos Pred Value : 0.9574          
##          Neg Pred Value : 0.5455          
##              Prevalence : 0.9048          
##          Detection Rate : 0.8571          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.7737          
##                                           
##        'Positive' Class : 0               
## 
acc_nn4 <- conf_nn4$overall[1]
sens_nn4 <- conf_nn4$byClass[1]
spec_nn4 <- conf_nn4$byClass[2]
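As a quick sanity check, the statistics printed above can be reproduced directly from the four cell counts of `nn_table4`. A minimal, language-neutral sketch in Python (the numbers are copied from the output above; it assumes caret's `confusionMatrix` convention for tables, where the reference classes are in the columns and the positive class here is 0):

```python
# Cell counts as printed in the confusion matrix above;
# columns are taken as the reference (actual) classes.
m = [[90, 4],
     [5,  6]]

total = sum(sum(row) for row in m)                 # 105 test observations
accuracy    = (m[0][0] + m[1][1]) / total          # (90 + 6) / 105
sensitivity = m[0][0] / (m[0][0] + m[1][0])        # recall of class 0: 90 / 95
specificity = m[1][1] / (m[0][1] + m[1][1])        # recall of class 1: 6 / 10
neg_pred    = m[1][1] / (m[1][0] + m[1][1])        # neg. pred. value: 6 / 11

print(round(accuracy, 4), round(sensitivity, 4),
      round(specificity, 2), round(neg_pred, 4))
# 0.9143 0.9474 0.6 0.5455 -- matching the caret output
```

This also makes the trade-off visible: the high overall accuracy is driven by the majority class, while only 6 of the 10 reference positives in this reading are caught.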

Now we train one more neural network with the train function, but this time we use an upsampled training set with a balanced response variable and dummy-encode the data beforehand. We are curious how it compares to the model trained on the "normal" training data.
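The R code below relies on caret's `upSample`, which resamples the minority class with replacement until both classes are equally frequent. A minimal Python sketch of the same idea; the positive count of 34 is an illustrative assumption (the real minority count in `data_train` is not shown in the report), only the majority count of 380 comes from the output below:

```python
import random

random.seed(0)

# Hypothetical imbalanced labels: 380 negatives, 34 positives (assumed).
labels = [0] * 380 + [1] * 34
minority = [y for y in labels if y == 1]

# Upsample the minority class with replacement until it matches the majority.
upsampled = labels + random.choices(minority, k=380 - len(minority))

counts = {c: upsampled.count(c) for c in (0, 1)}
print(counts)  # both classes now have 380 observations
```

Note that only the training set is balanced this way; the test set keeps its original class distribution so that the evaluation remains honest.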

set.seed(1910837262)
up_trainset_nn <- upSample(x = data_train[, -ncol(data_train)],
                     y = as.factor(data_train$target))                         
table(up_trainset_nn$target) 
## 
##   0   1 
## 380 380
up_trainset_nn <- up_trainset_nn %>%
    select(-Class)
for (f in ohe_feats){
  df_all_dummy = acm.disjonctif(up_trainset_nn[f])
  up_trainset_nn[f] = NULL
  up_trainset_nn = cbind(up_trainset_nn, df_all_dummy)
}
testset_nn <- data_test
for (f in ohe_feats){
  df_all_dummy = acm.disjonctif(testset_nn[f])
  testset_nn[f] = NULL
  testset_nn = cbind(testset_nn, df_all_dummy)
}
# caret model with the upsampled training set:
modell_nn5 <- train(up_trainset_nn[,-2], up_trainset_nn$target,
                  method = "nnet",
                  trControl= TrainingParameters_nn,
                  preProcess=c("scale","center")
)
## # weights:  22
## initial  value 493.250696 
## iter  10 value 261.647044
## iter  20 value 207.231245
## iter  30 value 201.548351
## iter  40 value 199.725889
## iter  50 value 199.555556
## iter  60 value 199.549818
## iter  70 value 199.539122
## iter  80 value 199.535649
## final  value 199.535645 
## converged
## (nnet training trace for the remaining resamples and tuning combinations omitted)
## # weights:  22
## initial  value 487.935365 
## iter  10 value 261.933517
## iter  20 value 220.855645
## iter  30 value 214.960230
## iter  40 value 209.865108
## iter  50 value 209.299737
## iter  60 value 209.009950
## iter  70 value 207.533366
## iter  80 value 207.356557
## iter  90 value 207.295256
## iter 100 value 207.204601
## final  value 207.204601 
## stopped after 100 iterations
## # weights:  64
## initial  value 512.616609 
## iter  10 value 212.073007
## iter  20 value 161.846817
## iter  30 value 142.740344
## iter  40 value 135.468559
## iter  50 value 128.934725
## iter  60 value 127.611404
## iter  70 value 127.254486
## iter  80 value 126.492869
## iter  90 value 125.935272
## iter 100 value 125.517915
## final  value 125.517915 
## stopped after 100 iterations
## # weights:  106
## initial  value 489.190572 
## iter  10 value 195.239424
## iter  20 value 140.770053
## iter  30 value 98.235881
## iter  40 value 80.820329
## iter  50 value 77.898583
## iter  60 value 75.335549
## iter  70 value 74.899541
## iter  80 value 74.624690
## iter  90 value 69.556765
## iter 100 value 67.327042
## final  value 67.327042 
## stopped after 100 iterations
## # weights:  22
## initial  value 500.894604 
## iter  10 value 239.074952
## iter  20 value 207.098435
## iter  30 value 203.945194
## iter  40 value 203.699530
## iter  50 value 203.681553
## final  value 203.681504 
## converged
## # weights:  64
## initial  value 484.299959 
## iter  10 value 188.475848
## iter  20 value 149.539878
## iter  30 value 129.154516
## iter  40 value 122.240683
## iter  50 value 120.138977
## iter  60 value 119.860344
## iter  70 value 119.771342
## iter  80 value 119.604807
## iter  90 value 119.600916
## final  value 119.600878 
## converged
## # weights:  106
## initial  value 521.003756 
## iter  10 value 187.711872
## iter  20 value 136.719411
## iter  30 value 114.330956
## iter  40 value 103.521370
## iter  50 value 94.829828
## iter  60 value 92.042201
## iter  70 value 91.459898
## iter  80 value 91.136035
## iter  90 value 90.956600
## iter 100 value 90.860097
## final  value 90.860097 
## stopped after 100 iterations
## # weights:  22
## initial  value 484.762682 
## iter  10 value 241.663402
## iter  20 value 228.631823
## iter  30 value 216.569097
## iter  40 value 210.495670
## iter  50 value 210.343248
## iter  60 value 210.342421
## final  value 210.342359 
## converged
## # weights:  64
## initial  value 474.932715 
## iter  10 value 268.527652
## iter  20 value 195.727091
## iter  30 value 176.478247
## iter  40 value 163.286264
## iter  50 value 158.020244
## iter  60 value 155.521383
## iter  70 value 154.302528
## iter  80 value 154.144604
## iter  90 value 154.140696
## iter 100 value 154.132262
## final  value 154.132262 
## stopped after 100 iterations
## # weights:  106
## initial  value 493.287996 
## iter  10 value 198.261732
## iter  20 value 159.700190
## iter  30 value 140.841161
## iter  40 value 130.112202
## iter  50 value 124.472536
## iter  60 value 122.496786
## iter  70 value 120.861351
## iter  80 value 118.394076
## iter  90 value 114.233775
## iter 100 value 112.762639
## final  value 112.762639 
## stopped after 100 iterations
## # weights:  22
## initial  value 481.384152 
## iter  10 value 276.034183
## iter  20 value 213.926075
## iter  30 value 204.676772
## iter  40 value 204.556915
## iter  50 value 204.517481
## iter  60 value 204.501071
## iter  70 value 204.480030
## iter  70 value 204.480028
## iter  70 value 204.480028
## final  value 204.480028 
## converged
## # weights:  64
## initial  value 485.773315 
## iter  10 value 182.345405
## iter  20 value 138.085683
## iter  30 value 122.195990
## iter  40 value 116.137478
## iter  50 value 115.381193
## iter  60 value 115.068872
## iter  70 value 114.868972
## iter  80 value 114.684889
## iter  90 value 114.405134
## iter 100 value 114.291189
## final  value 114.291189 
## stopped after 100 iterations
## # weights:  106
## initial  value 599.135359 
## iter  10 value 173.622545
## iter  20 value 111.713362
## iter  30 value 78.432991
## iter  40 value 67.596249
## iter  50 value 60.777746
## iter  60 value 57.620806
## iter  70 value 55.846580
## iter  80 value 55.459143
## iter  90 value 55.220997
## iter 100 value 55.065950
## final  value 55.065950 
## stopped after 100 iterations
## # weights:  22
## initial  value 527.826128 
## iter  10 value 283.540536
## iter  20 value 229.137257
## iter  30 value 222.464152
## iter  40 value 214.759629
## iter  50 value 214.071233
## iter  60 value 214.067850
## iter  70 value 214.065466
## iter  80 value 214.065079
## final  value 214.064749 
## converged
## # weights:  64
## initial  value 475.814788 
## iter  10 value 224.880875
## iter  20 value 177.458752
## iter  30 value 156.434289
## iter  40 value 139.781924
## iter  50 value 135.685051
## iter  60 value 129.778596
## iter  70 value 126.762204
## iter  80 value 126.543241
## iter  90 value 126.452245
## iter 100 value 126.444789
## final  value 126.444789 
## stopped after 100 iterations
## # weights:  106
## initial  value 488.218610 
## iter  10 value 199.001926
## iter  20 value 158.770537
## iter  30 value 108.911611
## iter  40 value 90.648199
## iter  50 value 65.648496
## iter  60 value 50.673020
## iter  70 value 44.898738
## iter  80 value 41.177145
## iter  90 value 38.335727
## iter 100 value 36.212441
## final  value 36.212441 
## stopped after 100 iterations
## # weights:  22
## initial  value 472.079367 
## iter  10 value 293.639905
## iter  20 value 236.754359
## iter  30 value 221.819838
## iter  40 value 219.992009
## final  value 219.980018 
## converged
## # weights:  64
## initial  value 553.360171 
## iter  10 value 204.133295
## iter  20 value 171.660669
## iter  30 value 163.624920
## iter  40 value 158.689244
## iter  50 value 156.287390
## iter  60 value 155.175508
## iter  70 value 155.076551
## iter  80 value 155.070223
## final  value 155.070169 
## converged
## # weights:  106
## initial  value 502.942670 
## iter  10 value 199.341727
## iter  20 value 157.916575
## iter  30 value 140.815669
## iter  40 value 133.661660
## iter  50 value 129.344405
## iter  60 value 125.452794
## iter  70 value 124.352318
## iter  80 value 123.396002
## iter  90 value 123.175094
## iter 100 value 123.152156
## final  value 123.152156 
## stopped after 100 iterations
## # weights:  22
## initial  value 457.622360 
## iter  10 value 290.283561
## iter  20 value 220.197162
## iter  30 value 208.640633
## iter  40 value 204.713790
## iter  50 value 202.470318
## iter  60 value 202.202921
## iter  70 value 202.187062
## iter  80 value 202.180533
## iter  90 value 202.173077
## iter 100 value 202.171962
## final  value 202.171962 
## stopped after 100 iterations
## # weights:  64
## initial  value 478.119361 
## iter  10 value 201.199462
## iter  20 value 159.690269
## iter  30 value 141.041601
## iter  40 value 127.661395
## iter  50 value 124.593887
## iter  60 value 122.936259
## iter  70 value 122.373508
## iter  80 value 122.317488
## iter  90 value 122.258359
## iter 100 value 122.200667
## final  value 122.200667 
## stopped after 100 iterations
## # weights:  106
## initial  value 495.168867 
## iter  10 value 184.374813
## iter  20 value 119.940050
## iter  30 value 81.458157
## iter  40 value 56.602053
## iter  50 value 51.182952
## iter  60 value 45.736550
## iter  70 value 43.600108
## iter  80 value 42.468799
## iter  90 value 42.035533
## iter 100 value 41.162377
## final  value 41.162377 
## stopped after 100 iterations
## # weights:  22
## initial  value 502.667674 
## iter  10 value 208.836573
## iter  20 value 200.346606
## iter  30 value 195.939692
## iter  40 value 192.279084
## iter  50 value 189.778232
## iter  60 value 189.766882
## final  value 189.766802 
## converged
## # weights:  64
## initial  value 493.030021 
## iter  10 value 223.886905
## iter  20 value 174.302831
## iter  30 value 137.839612
## iter  40 value 122.193055
## iter  50 value 119.615738
## iter  60 value 117.142941
## iter  70 value 115.400674
## iter  80 value 112.659130
## iter  90 value 110.151726
## iter 100 value 109.781726
## final  value 109.781726 
## stopped after 100 iterations
## # weights:  106
## initial  value 445.026036 
## iter  10 value 170.823584
## iter  20 value 125.361823
## iter  30 value 101.785923
## iter  40 value 89.514944
## iter  50 value 78.846046
## iter  60 value 74.043250
## iter  70 value 72.677529
## iter  80 value 72.609824
## iter  90 value 72.607706
## final  value 72.607697 
## converged
## # weights:  22
## initial  value 474.443366 
## iter  10 value 214.744781
## iter  20 value 207.935595
## iter  30 value 207.636157
## final  value 207.636129 
## converged
## # weights:  64
## initial  value 461.706822 
## iter  10 value 202.739386
## iter  20 value 176.048635
## iter  30 value 171.355899
## iter  40 value 169.507998
## iter  50 value 168.142605
## iter  60 value 167.985764
## iter  70 value 167.976147
## final  value 167.976102 
## converged
## # weights:  106
## initial  value 546.126032 
## iter  10 value 188.562222
## iter  20 value 150.666506
## iter  30 value 129.026099
## iter  40 value 117.601497
## iter  50 value 113.181931
## iter  60 value 110.486812
## iter  70 value 107.464842
## iter  80 value 105.054630
## iter  90 value 103.692453
## iter 100 value 103.412434
## final  value 103.412434 
## stopped after 100 iterations
## # weights:  22
## initial  value 455.501277 
## iter  10 value 214.191773
## iter  20 value 205.879202
## iter  30 value 202.860037
## iter  40 value 200.613854
## iter  50 value 200.573906
## iter  60 value 200.572468
## iter  70 value 200.567618
## final  value 200.567607 
## converged
## # weights:  64
## initial  value 433.546062 
## iter  10 value 251.499049
## iter  20 value 217.810536
## iter  30 value 190.891446
## iter  40 value 168.629177
## iter  50 value 148.744569
## iter  60 value 131.583181
## iter  70 value 129.056732
## iter  80 value 126.735649
## iter  90 value 123.894470
## iter 100 value 122.780144
## final  value 122.780144 
## stopped after 100 iterations
## # weights:  106
## initial  value 460.349657 
## iter  10 value 178.240460
## iter  20 value 121.886572
## iter  30 value 108.232740
## iter  40 value 98.167257
## iter  50 value 92.193318
## iter  60 value 90.813202
## iter  70 value 90.326533
## iter  80 value 90.138843
## iter  90 value 89.854712
## iter 100 value 89.258725
## final  value 89.258725 
## stopped after 100 iterations
## # weights:  22
## initial  value 497.942852 
## iter  10 value 244.483997
## iter  20 value 212.801715
## iter  30 value 208.296937
## iter  40 value 205.479980
## iter  50 value 199.011198
## iter  60 value 198.947638
## iter  70 value 198.925121
## iter  80 value 198.894470
## iter  90 value 198.885483
## final  value 198.884950 
## converged
## # weights:  64
## initial  value 492.347645 
## iter  10 value 207.238783
## iter  20 value 172.708162
## iter  30 value 143.748804
## iter  40 value 124.541473
## iter  50 value 101.405032
## iter  60 value 96.063187
## iter  70 value 95.144925
## iter  80 value 94.706162
## iter  90 value 94.393538
## iter 100 value 94.160725
## final  value 94.160725 
## stopped after 100 iterations
## # weights:  106
## initial  value 503.662331 
## iter  10 value 189.491363
## iter  20 value 135.217579
## iter  30 value 114.382465
## iter  40 value 106.696559
## iter  50 value 104.652508
## iter  60 value 103.186564
## iter  70 value 99.278303
## iter  80 value 95.367204
## iter  90 value 94.821899
## iter 100 value 93.986369
## final  value 93.986369 
## stopped after 100 iterations
## # weights:  22
## initial  value 488.271407 
## iter  10 value 251.244646
## iter  20 value 221.361642
## iter  30 value 219.451367
## final  value 219.421050 
## converged
## # weights:  64
## initial  value 482.849339 
## iter  10 value 195.287369
## iter  20 value 171.591981
## iter  30 value 163.208384
## iter  40 value 156.920422
## iter  50 value 148.589871
## iter  60 value 141.005759
## iter  70 value 138.507873
## iter  80 value 137.562405
## iter  90 value 137.550029
## final  value 137.549567 
## converged
## # weights:  106
## initial  value 508.964363 
## iter  10 value 197.221760
## iter  20 value 153.391191
## iter  30 value 140.874922
## iter  40 value 133.269329
## iter  50 value 129.309723
## iter  60 value 127.594878
## iter  70 value 127.204548
## iter  80 value 127.061611
## iter  90 value 127.001774
## iter 100 value 126.945202
## final  value 126.945202 
## stopped after 100 iterations
## # weights:  22
## initial  value 475.495529 
## iter  10 value 300.105265
## iter  20 value 247.171828
## iter  30 value 236.613335
## iter  40 value 223.463220
## iter  50 value 221.129310
## iter  60 value 218.474020
## iter  70 value 218.148818
## final  value 218.148738 
## converged
## # weights:  64
## initial  value 483.071397 
## iter  10 value 192.037804
## iter  20 value 169.939711
## iter  30 value 139.849598
## iter  40 value 124.938579
## iter  50 value 121.608358
## iter  60 value 119.603849
## iter  70 value 118.164425
## iter  80 value 115.816512
## iter  90 value 114.870099
## iter 100 value 114.614717
## final  value 114.614717 
## stopped after 100 iterations
## # weights:  106
## initial  value 513.679373 
## iter  10 value 207.620273
## iter  20 value 159.986945
## iter  30 value 100.541248
## iter  40 value 85.271240
## iter  50 value 76.363491
## iter  60 value 72.688590
## iter  70 value 72.073593
## iter  80 value 71.331773
## iter  90 value 69.754310
## iter 100 value 69.152381
## final  value 69.152381 
## stopped after 100 iterations
## # weights:  22
## initial  value 482.601259 
## iter  10 value 262.725097
## iter  20 value 221.831658
## iter  30 value 208.543236
## iter  40 value 208.423240
## iter  50 value 208.333408
## iter  60 value 208.329510
## iter  70 value 208.314749
## final  value 208.314302 
## converged
## # weights:  64
## initial  value 520.801090 
## iter  10 value 197.795988
## iter  20 value 160.290480
## iter  30 value 149.241291
## iter  40 value 139.346923
## iter  50 value 127.721738
## iter  60 value 116.873078
## iter  70 value 110.601892
## iter  80 value 106.807490
## iter  90 value 103.715181
## iter 100 value 102.575314
## final  value 102.575314 
## stopped after 100 iterations
## # weights:  106
## initial  value 504.017632 
## iter  10 value 225.293592
## iter  20 value 142.427804
## iter  30 value 126.218868
## iter  40 value 119.760895
## iter  50 value 112.271422
## iter  60 value 109.481650
## iter  70 value 105.520666
## iter  80 value 103.833757
## iter  90 value 102.931339
## iter 100 value 101.673253
## final  value 101.673253 
## stopped after 100 iterations
## # weights:  22
## initial  value 472.914877 
## iter  10 value 260.866747
## iter  20 value 227.936098
## iter  30 value 226.148098
## final  value 226.142932 
## converged
## # weights:  64
## initial  value 493.717486 
## iter  10 value 235.129177
## iter  20 value 201.854245
## iter  30 value 174.342030
## iter  40 value 163.690875
## iter  50 value 158.405138
## iter  60 value 157.067754
## iter  70 value 156.755708
## iter  80 value 156.752223
## final  value 156.752197 
## converged
## # weights:  106
## initial  value 462.132473 
## iter  10 value 203.614554
## iter  20 value 168.506311
## iter  30 value 149.275431
## iter  40 value 136.741911
## iter  50 value 132.865040
## iter  60 value 127.046393
## iter  70 value 125.222114
## iter  80 value 123.390444
## iter  90 value 121.768152
## iter 100 value 120.350631
## final  value 120.350631 
## stopped after 100 iterations
## # weights:  22
## initial  value 477.725045 
## iter  10 value 332.323953
## iter  20 value 238.416160
## iter  30 value 226.065089
## iter  40 value 223.371333
## iter  50 value 223.239602
## iter  60 value 223.186187
## iter  70 value 223.176851
## iter  80 value 223.174169
## iter  90 value 223.173527
## iter 100 value 223.171580
## final  value 223.171580 
## stopped after 100 iterations
## # weights:  64
## initial  value 544.726372 
## iter  10 value 216.883726
## iter  20 value 178.906676
## iter  30 value 146.708566
## iter  40 value 137.261898
## iter  50 value 131.654277
## iter  60 value 127.755316
## iter  70 value 122.413199
## iter  80 value 121.810442
## iter  90 value 121.664216
## iter 100 value 121.525437
## final  value 121.525437 
## stopped after 100 iterations
## # weights:  106
## initial  value 524.890677 
## iter  10 value 230.736882
## iter  20 value 154.815111
## iter  30 value 127.323943
## iter  40 value 104.619244
## iter  50 value 99.165184
## iter  60 value 87.188356
## iter  70 value 74.255085
## iter  80 value 63.942511
## iter  90 value 58.371071
## iter 100 value 57.238133
## final  value 57.238133 
## stopped after 100 iterations
## # weights:  22
## initial  value 517.842480 
## iter  10 value 213.959501
## iter  20 value 206.566352
## iter  30 value 200.328303
## iter  40 value 191.476410
## iter  50 value 191.162000
## iter  60 value 191.160287
## iter  70 value 191.159575
## iter  80 value 191.159066
## iter  90 value 191.158071
## final  value 191.158057 
## converged
## # weights:  64
## initial  value 477.257590 
## iter  10 value 191.062763
## iter  20 value 156.775064
## iter  30 value 107.890187
## iter  40 value 99.501775
## iter  50 value 97.399875
## iter  60 value 96.507247
## iter  70 value 96.248127
## iter  80 value 96.159021
## iter  90 value 96.147748
## iter 100 value 96.143609
## final  value 96.143609 
## stopped after 100 iterations
## # weights:  106
## initial  value 474.932286 
## iter  10 value 171.410968
## iter  20 value 103.123374
## iter  30 value 69.846476
## iter  40 value 53.590904
## iter  50 value 48.727833
## iter  60 value 48.332382
## iter  70 value 48.252608
## iter  80 value 47.642065
## iter  90 value 47.133472
## iter 100 value 46.888295
## final  value 46.888295 
## stopped after 100 iterations
## # weights:  22
## initial  value 493.985557 
## iter  10 value 234.114006
## iter  20 value 220.200961
## iter  30 value 218.574913
## final  value 218.572405 
## converged
## # weights:  64
## initial  value 479.998719 
## iter  10 value 277.874410
## iter  20 value 214.577563
## iter  30 value 173.801014
## iter  40 value 164.242906
## iter  50 value 161.792829
## iter  60 value 160.917004
## iter  70 value 160.190197
## iter  80 value 143.830896
## iter  90 value 142.978101
## iter 100 value 142.896483
## final  value 142.896483 
## stopped after 100 iterations
## # weights:  106
## initial  value 519.619080 
## iter  10 value 189.553982
## iter  20 value 152.406895
## iter  30 value 139.535914
## iter  40 value 131.200443
## iter  50 value 126.416309
## iter  60 value 124.392750
## iter  70 value 123.297123
## iter  80 value 122.777366
## iter  90 value 122.481699
## iter 100 value 122.329251
## final  value 122.329251 
## stopped after 100 iterations
## # weights:  22
## initial  value 499.358501 
## iter  10 value 230.032289
## iter  20 value 206.556718
## iter  30 value 200.207924
## iter  40 value 196.646833
## iter  50 value 192.805068
## iter  60 value 192.741924
## iter  70 value 192.706983
## iter  80 value 192.665183
## iter  90 value 192.599258
## iter 100 value 192.589877
## final  value 192.589877 
## stopped after 100 iterations
## # weights:  64
## initial  value 493.996606 
## iter  10 value 189.959799
## iter  20 value 132.004498
## iter  30 value 116.457401
## iter  40 value 110.362993
## iter  50 value 108.821375
## iter  60 value 108.629210
## iter  70 value 108.399098
## iter  80 value 108.302242
## iter  90 value 107.130601
## iter 100 value 106.769495
## final  value 106.769495 
## stopped after 100 iterations
## # weights:  106
## initial  value 474.732801 
## iter  10 value 184.710814
## iter  20 value 134.934449
## iter  30 value 96.853212
## iter  40 value 63.733234
## iter  50 value 51.462917
## iter  60 value 48.506150
## iter  70 value 47.961837
## iter  80 value 46.046555
## iter  90 value 45.455620
## iter 100 value 45.325008
## final  value 45.325008 
## stopped after 100 iterations
## # weights:  22
## initial  value 534.273849 
## iter  10 value 230.563112
## iter  20 value 211.833016
## iter  30 value 209.455443
## iter  40 value 209.072361
## final  value 209.072300 
## converged
## # weights:  64
## initial  value 516.331455 
## iter  10 value 224.428282
## iter  20 value 185.977951
## iter  30 value 179.768415
## iter  40 value 172.984682
## iter  50 value 165.656378
## iter  60 value 158.627782
## iter  70 value 156.636274
## iter  80 value 156.624728
## final  value 156.624711 
## converged
## # weights:  106
## initial  value 473.199492 
## iter  10 value 156.286133
## iter  20 value 104.055529
## iter  30 value 80.985336
## iter  40 value 73.912804
## iter  50 value 70.605828
## iter  60 value 68.586439
## iter  70 value 62.118555
## iter  80 value 62.007224
## final  value 62.007216 
## converged
## # weights:  22
## initial  value 489.235899 
## iter  10 value 275.712993
## iter  20 value 239.740229
## iter  30 value 229.207485
## iter  40 value 220.130373
## iter  50 value 215.431832
## iter  60 value 215.217464
## final  value 215.211851 
## converged
## # weights:  64
## initial  value 499.544666 
## iter  10 value 252.348886
## iter  20 value 212.773484
## iter  30 value 184.175284
## iter  40 value 168.438998
## iter  50 value 162.619836
## iter  60 value 156.944905
## iter  70 value 156.101736
## iter  80 value 155.549441
## iter  90 value 152.378984
## iter 100 value 151.535550
## final  value 151.535550 
## stopped after 100 iterations
## # weights:  106
## initial  value 456.385258 
## iter  10 value 222.079842
## iter  20 value 177.481892
## iter  30 value 143.757907
## iter  40 value 131.550344
## iter  50 value 123.798966
## iter  60 value 118.626942
## iter  70 value 117.548392
## iter  80 value 116.960458
## iter  90 value 116.527211
## iter 100 value 115.817995
## final  value 115.817995 
## stopped after 100 iterations
## # weights:  22
## initial  value 472.285342 
## iter  10 value 298.005101
## iter  20 value 218.129858
## iter  30 value 207.918215
## iter  40 value 206.065180
## iter  50 value 205.862846
## iter  60 value 204.518224
## iter  70 value 200.883101
## iter  80 value 200.459230
## iter  90 value 200.147505
## iter 100 value 200.128299
## final  value 200.128299 
## stopped after 100 iterations
## # weights:  64
## initial  value 460.890552 
## iter  10 value 208.818823
## iter  20 value 174.824906
## iter  30 value 146.675375
## iter  40 value 129.392215
## iter  50 value 122.526398
## iter  60 value 121.015968
## iter  70 value 120.944094
## iter  80 value 120.831068
## iter  90 value 120.482434
## iter 100 value 119.875204
## final  value 119.875204 
## stopped after 100 iterations
## # weights:  106
## initial  value 492.655200 
## iter  10 value 175.377866
## iter  20 value 124.227324
## iter  30 value 90.655727
## iter  40 value 83.225776
## iter  50 value 79.288396
## iter  60 value 77.373607
## iter  70 value 75.562078
## iter  80 value 74.389147
## iter  90 value 73.531038
## iter 100 value 71.927598
## final  value 71.927598 
## stopped after 100 iterations
## # weights:  22
## initial  value 485.958618 
## iter  10 value 289.033792
## iter  20 value 210.876254
## iter  30 value 202.317137
## iter  40 value 200.387501
## iter  50 value 199.347173
## iter  60 value 194.537440
## iter  70 value 194.468697
## final  value 194.468362 
## converged
## # weights:  64
## initial  value 448.568540 
## iter  10 value 166.255337
## iter  20 value 134.017797
## iter  30 value 121.766852
## iter  40 value 113.648694
## iter  50 value 112.270249
## iter  60 value 111.574278
## iter  70 value 111.526891
## iter  80 value 111.523251
## iter  90 value 111.522683
## iter 100 value 111.522201
## final  value 111.522201 
## stopped after 100 iterations
## # weights:  106
## initial  value 441.121731 
## iter  10 value 184.985045
## iter  20 value 140.507765
## iter  30 value 105.552538
## iter  40 value 83.769909
## iter  50 value 74.757352
## iter  60 value 73.846075
## iter  70 value 73.377060
## iter  80 value 73.288694
## iter  90 value 73.274885
## iter 100 value 73.273464
## final  value 73.273464 
## stopped after 100 iterations
## # weights:  22
## initial  value 508.652887 
## iter  10 value 230.885233
## iter  20 value 211.590407
## iter  30 value 207.892669
## iter  40 value 206.989450
## final  value 206.989430 
## converged
## # weights:  64
## initial  value 493.054407 
## iter  10 value 223.310142
## iter  20 value 202.556027
## iter  30 value 192.859176
## iter  40 value 172.449231
## iter  50 value 157.365620
## iter  60 value 148.482094
## iter  70 value 146.293654
## iter  80 value 145.834359
## iter  90 value 145.826320
## final  value 145.826264 
## converged
## # weights:  106
## initial  value 636.323743 
## iter  10 value 221.578512
## iter  20 value 170.175645
## iter  30 value 158.328390
## iter  40 value 144.871619
## iter  50 value 128.312553
## iter  60 value 122.058350
## iter  70 value 119.045070
## iter  80 value 118.204872
## iter  90 value 118.015104
## iter 100 value 117.991065
## final  value 117.991065 
## stopped after 100 iterations
## # weights:  22
## initial  value 464.787177 
## iter  10 value 232.162492
## iter  20 value 206.867561
## iter  30 value 182.598126
## iter  40 value 179.808255
## iter  50 value 179.724471
## iter  60 value 179.598917
## iter  70 value 179.553491
## iter  80 value 179.528137
## iter  90 value 179.521965
## iter 100 value 179.518964
## final  value 179.518964 
## stopped after 100 iterations
## # weights:  64
## initial  value 528.452950 
## iter  10 value 211.912013
## iter  20 value 191.790034
## iter  30 value 148.452473
## iter  40 value 120.543755
## iter  50 value 95.388140
## iter  60 value 85.463612
## iter  70 value 84.240367
## iter  80 value 81.691807
## iter  90 value 80.871778
## iter 100 value 80.531458
## final  value 80.531458 
## stopped after 100 iterations
## # weights:  106
## initial  value 655.193072 
## iter  10 value 181.567649
## iter  20 value 132.436710
## iter  30 value 100.170427
## iter  40 value 83.708299
## iter  50 value 76.381914
## iter  60 value 72.011355
## iter  70 value 69.385398
## iter  80 value 67.522182
## iter  90 value 66.889163
## iter 100 value 66.336170
## final  value 66.336170 
## stopped after 100 iterations
## # weights:  22
## initial  value 473.002802 
## iter  10 value 254.550244
## iter  20 value 220.125021
## iter  30 value 213.343878
## iter  40 value 208.035052
## iter  50 value 207.752441
## iter  60 value 207.748111
## iter  70 value 207.747036
## final  value 207.747024 
## converged
## # weights:  64
## initial  value 465.647120 
## iter  10 value 196.420367
## iter  20 value 159.328410
## iter  30 value 139.218008
## iter  40 value 129.508713
## iter  50 value 126.139981
## iter  60 value 119.857951
## iter  70 value 119.670866
## final  value 119.670626 
## converged
## # weights:  106
## initial  value 474.294101 
## iter  10 value 172.332586
## iter  20 value 104.241060
## iter  30 value 74.822025
## iter  40 value 54.604939
## iter  50 value 48.475248
## iter  60 value 47.373547
## iter  70 value 47.316557
## iter  80 value 47.298114
## iter  90 value 47.293356
## iter 100 value 47.292964
## final  value 47.292964 
## stopped after 100 iterations
## # weights:  22
## initial  value 479.012649 
## iter  10 value 267.571086
## iter  20 value 231.242012
## iter  30 value 225.488835
## iter  40 value 225.374327
## final  value 225.374301 
## converged
## # weights:  64
## initial  value 550.697061 
## iter  10 value 290.093424
## iter  20 value 257.000641
## iter  30 value 211.618951
## iter  40 value 188.963707
## iter  50 value 180.297590
## iter  60 value 174.075784
## iter  70 value 171.804349
## iter  80 value 169.647503
## iter  90 value 163.556362
## iter 100 value 159.292124
## final  value 159.292124 
## stopped after 100 iterations
## # weights:  106
## initial  value 471.213601 
## iter  10 value 199.624866
## iter  20 value 154.313960
## iter  30 value 137.306791
## iter  40 value 127.751942
## iter  50 value 122.768554
## iter  60 value 120.139309
## iter  70 value 119.677513
## iter  80 value 119.548294
## iter  90 value 119.414881
## iter 100 value 119.322355
## final  value 119.322355 
## stopped after 100 iterations
## (training traces for the remaining resamples and network sizes omitted:
##  nnet fits with 22, 64 and 106 weights, final objective values between
##  roughly 31 and 219; each run either converged or stopped after the
##  maximum of 100 iterations)
## # weights:  64
## initial  value 553.371213 
## iter  10 value 192.109616
## iter  20 value 128.883149
## iter  30 value 104.108135
## iter  40 value 96.644060
## iter  50 value 94.207858
## iter  60 value 93.582641
## iter  70 value 93.421909
## iter  80 value 93.227196
## iter  90 value 93.055376
## iter 100 value 92.966371
## final  value 92.966371 
## stopped after 100 iterations
## # weights:  106
## initial  value 558.061680 
## iter  10 value 207.052268
## iter  20 value 141.998723
## iter  30 value 101.745090
## iter  40 value 84.383474
## iter  50 value 65.087440
## iter  60 value 59.631691
## iter  70 value 59.164501
## iter  80 value 58.607318
## iter  90 value 58.208268
## iter 100 value 57.603986
## final  value 57.603986 
## stopped after 100 iterations
## # weights:  22
## initial  value 517.260687 
## iter  10 value 214.832727
## iter  20 value 200.832568
## iter  30 value 197.504809
## iter  40 value 194.388286
## iter  50 value 193.001591
## iter  60 value 192.127394
## iter  70 value 191.767595
## iter  80 value 191.490128
## iter  90 value 191.197831
## iter 100 value 191.100024
## final  value 191.100024 
## stopped after 100 iterations
## # weights:  64
## initial  value 504.432573 
## iter  10 value 192.202797
## iter  20 value 154.529445
## iter  30 value 136.306796
## iter  40 value 123.002968
## iter  50 value 120.132422
## iter  60 value 116.987311
## iter  70 value 108.033770
## iter  80 value 106.299013
## iter  90 value 105.812124
## iter 100 value 101.523016
## final  value 101.523016 
## stopped after 100 iterations
## # weights:  106
## initial  value 470.549507 
## iter  10 value 166.069447
## iter  20 value 98.117425
## iter  30 value 83.445177
## iter  40 value 73.562346
## iter  50 value 60.609468
## iter  60 value 57.269380
## iter  70 value 55.719243
## iter  80 value 54.764391
## iter  90 value 54.365289
## iter 100 value 54.166351
## final  value 54.166351 
## stopped after 100 iterations
## # weights:  22
## initial  value 483.855021 
## iter  10 value 248.192854
## iter  20 value 216.219770
## iter  30 value 214.080375
## iter  40 value 211.383292
## iter  50 value 209.295813
## iter  60 value 209.203290
## iter  70 value 209.199916
## iter  80 value 208.224093
## iter  90 value 207.516575
## final  value 207.486880 
## converged
## # weights:  64
## initial  value 493.977726 
## iter  10 value 240.326357
## iter  20 value 209.138377
## iter  30 value 185.129986
## iter  40 value 177.733349
## iter  50 value 175.298619
## iter  60 value 173.537918
## iter  70 value 172.263970
## iter  80 value 172.208037
## iter  90 value 172.205059
## final  value 172.205037 
## converged
## # weights:  106
## initial  value 469.914008 
## iter  10 value 180.087947
## iter  20 value 156.005138
## iter  30 value 144.891519
## iter  40 value 137.148415
## iter  50 value 132.001481
## iter  60 value 130.257066
## iter  70 value 129.328376
## iter  80 value 128.993639
## iter  90 value 128.894498
## iter 100 value 128.893081
## final  value 128.893081 
## stopped after 100 iterations
## # weights:  22
## initial  value 511.562962 
## iter  10 value 273.997377
## iter  20 value 211.475626
## iter  30 value 202.626295
## iter  40 value 202.604618
## final  value 202.597819 
## converged
## # weights:  64
## initial  value 480.983721 
## iter  10 value 183.839869
## iter  20 value 149.609263
## iter  30 value 140.337857
## iter  40 value 130.996405
## iter  50 value 126.605207
## iter  60 value 125.855225
## iter  70 value 125.700967
## iter  80 value 125.456178
## iter  90 value 125.050985
## iter 100 value 124.834724
## final  value 124.834724 
## stopped after 100 iterations
## # weights:  106
## initial  value 524.359174 
## iter  10 value 201.542434
## iter  20 value 155.830591
## iter  30 value 128.731433
## iter  40 value 115.868328
## iter  50 value 111.474402
## iter  60 value 110.013306
## iter  70 value 109.372006
## iter  80 value 108.268515
## iter  90 value 107.946647
## iter 100 value 107.278119
## final  value 107.278119 
## stopped after 100 iterations
## # weights:  22
## initial  value 515.540987 
## iter  10 value 231.276887
## iter  20 value 220.046829
## iter  30 value 211.449300
## iter  40 value 202.205865
## iter  50 value 195.427370
## iter  60 value 193.751781
## iter  70 value 193.745395
## final  value 193.745389 
## converged
## # weights:  64
## initial  value 488.614127 
## iter  10 value 188.637956
## iter  20 value 126.391114
## iter  30 value 103.564100
## iter  40 value 98.815252
## iter  50 value 96.935238
## iter  60 value 96.064148
## iter  70 value 95.914349
## iter  80 value 95.879844
## iter  90 value 95.877846
## final  value 95.877843 
## converged
## # weights:  106
## initial  value 455.362107 
## iter  10 value 173.628678
## iter  20 value 127.943760
## iter  30 value 91.652855
## iter  40 value 78.009491
## iter  50 value 71.390142
## iter  60 value 65.641581
## iter  70 value 64.921543
## iter  80 value 64.897362
## iter  90 value 64.895355
## iter 100 value 64.893844
## final  value 64.893844 
## stopped after 100 iterations
## # weights:  22
## initial  value 473.045754 
## iter  10 value 244.791190
## iter  20 value 227.666512
## iter  30 value 213.895064
## iter  40 value 208.576056
## iter  50 value 208.315612
## iter  60 value 208.313728
## final  value 208.313654 
## converged
## # weights:  64
## initial  value 451.075526 
## iter  10 value 241.432527
## iter  20 value 188.822706
## iter  30 value 165.534025
## iter  40 value 161.727365
## iter  50 value 160.494691
## iter  60 value 144.596981
## iter  70 value 139.388061
## iter  80 value 138.058361
## iter  90 value 136.150015
## iter 100 value 135.268302
## final  value 135.268302 
## stopped after 100 iterations
## # weights:  106
## initial  value 471.562941 
## iter  10 value 181.358411
## iter  20 value 156.331927
## iter  30 value 145.312772
## iter  40 value 130.165462
## iter  50 value 123.530406
## iter  60 value 120.941544
## iter  70 value 120.210719
## iter  80 value 119.879293
## iter  90 value 119.830426
## iter 100 value 119.823286
## final  value 119.823286 
## stopped after 100 iterations
## # weights:  22
## initial  value 483.848293 
## iter  10 value 301.682206
## iter  20 value 208.659185
## iter  30 value 202.464051
## iter  40 value 202.104437
## iter  50 value 201.291682
## iter  60 value 201.049826
## iter  70 value 201.046732
## final  value 201.045468 
## converged
## # weights:  64
## initial  value 555.870979 
## iter  10 value 192.524036
## iter  20 value 126.570538
## iter  30 value 106.338919
## iter  40 value 97.636856
## iter  50 value 91.462720
## iter  60 value 89.715380
## iter  70 value 88.588613
## iter  80 value 87.982672
## iter  90 value 87.629137
## iter 100 value 87.326022
## final  value 87.326022 
## stopped after 100 iterations
## # weights:  106
## initial  value 472.986693 
## iter  10 value 173.157290
## iter  20 value 127.529338
## iter  30 value 94.806812
## iter  40 value 88.287478
## iter  50 value 85.014470
## iter  60 value 78.201744
## iter  70 value 75.229542
## iter  80 value 71.268208
## iter  90 value 67.327955
## iter 100 value 67.003390
## final  value 67.003390 
## stopped after 100 iterations
## # weights:  22
## initial  value 503.835036 
## iter  10 value 258.598679
## iter  20 value 223.000398
## iter  30 value 211.960998
## iter  40 value 197.273722
## iter  50 value 192.746363
## iter  60 value 190.973027
## iter  70 value 189.082588
## iter  80 value 188.687205
## iter  90 value 186.036415
## iter 100 value 185.778627
## final  value 185.778627 
## stopped after 100 iterations
## # weights:  64
## initial  value 468.382035 
## iter  10 value 203.574098
## iter  20 value 167.598964
## iter  30 value 137.278794
## iter  40 value 124.329418
## iter  50 value 118.944769
## iter  60 value 115.687461
## iter  70 value 114.348110
## iter  80 value 113.156840
## iter  90 value 112.663068
## iter 100 value 112.388184
## final  value 112.388184 
## stopped after 100 iterations
## # weights:  106
## initial  value 471.278766 
## iter  10 value 217.119363
## iter  20 value 121.352401
## iter  30 value 96.024085
## iter  40 value 88.022803
## iter  50 value 83.726900
## iter  60 value 79.923680
## iter  70 value 74.811410
## iter  80 value 74.024481
## iter  90 value 61.019476
## iter 100 value 56.514372
## final  value 56.514372 
## stopped after 100 iterations
## # weights:  22
## initial  value 491.556999 
## iter  10 value 241.729292
## iter  20 value 226.190241
## iter  30 value 222.001952
## iter  40 value 221.802029
## final  value 221.802024 
## converged
## # weights:  64
## initial  value 520.594948 
## iter  10 value 214.924839
## iter  20 value 179.595307
## iter  30 value 171.067742
## iter  40 value 169.715990
## iter  50 value 169.284259
## iter  60 value 168.981097
## iter  70 value 168.922896
## iter  80 value 168.902398
## final  value 168.902050 
## converged
## # weights:  106
## initial  value 465.416170 
## iter  10 value 191.494642
## iter  20 value 150.049109
## iter  30 value 134.291123
## iter  40 value 129.056194
## iter  50 value 126.561617
## iter  60 value 125.582437
## iter  70 value 124.018671
## iter  80 value 123.681497
## iter  90 value 123.301549
## iter 100 value 122.514975
## final  value 122.514975 
## stopped after 100 iterations
## # weights:  22
## initial  value 521.388965 
## iter  10 value 222.860991
## iter  20 value 216.147391
## iter  30 value 215.065923
## final  value 215.008697 
## converged
## # weights:  64
## initial  value 464.987451 
## iter  10 value 198.189794
## iter  20 value 172.110457
## iter  30 value 157.424054
## iter  40 value 156.470198
## iter  50 value 156.342971
## iter  60 value 155.909622
## iter  70 value 155.555634
## iter  80 value 155.319836
## iter  90 value 154.857941
## iter 100 value 154.220542
## final  value 154.220542 
## stopped after 100 iterations
## # weights:  106
## initial  value 462.563701 
## iter  10 value 187.500994
## iter  20 value 114.668723
## iter  30 value 90.867648
## iter  40 value 72.098882
## iter  50 value 58.730675
## iter  60 value 56.828519
## iter  70 value 56.500690
## iter  80 value 55.561940
## iter  90 value 55.354682
## iter 100 value 55.143939
## final  value 55.143939 
## stopped after 100 iterations
## # weights:  22
## initial  value 503.073027 
## iter  10 value 266.568385
## iter  20 value 217.724263
## iter  30 value 210.089408
## iter  40 value 203.209121
## iter  50 value 201.730234
## iter  60 value 201.538905
## iter  70 value 200.502936
## iter  80 value 197.988272
## iter  90 value 197.942984
## final  value 197.942960 
## converged
## # weights:  64
## initial  value 463.225910 
## iter  10 value 211.529019
## iter  20 value 158.179824
## iter  30 value 134.759613
## iter  40 value 124.250628
## iter  50 value 118.921417
## iter  60 value 117.018413
## iter  70 value 116.330013
## iter  80 value 116.271961
## iter  90 value 116.102879
## iter 100 value 115.990479
## final  value 115.990479 
## stopped after 100 iterations
## # weights:  106
## initial  value 557.435038 
## iter  10 value 207.391040
## iter  20 value 137.433395
## iter  30 value 94.367417
## iter  40 value 70.397979
## iter  50 value 49.203502
## iter  60 value 48.217860
## iter  70 value 48.118163
## iter  80 value 48.113912
## final  value 48.113891 
## converged
## # weights:  22
## initial  value 470.589756 
## iter  10 value 257.841457
## iter  20 value 239.119308
## iter  30 value 232.787227
## iter  40 value 227.783360
## iter  50 value 221.959796
## iter  60 value 219.698254
## iter  70 value 218.723244
## iter  80 value 218.722942
## final  value 218.722936 
## converged
## # weights:  64
## initial  value 537.609918 
## iter  10 value 237.115283
## iter  20 value 194.165823
## iter  30 value 167.384159
## iter  40 value 150.930152
## iter  50 value 137.164967
## iter  60 value 133.010058
## iter  70 value 131.995469
## iter  80 value 131.761986
## iter  90 value 131.738579
## iter 100 value 131.710357
## final  value 131.710357 
## stopped after 100 iterations
## # weights:  106
## initial  value 459.652324 
## iter  10 value 190.478925
## iter  20 value 171.656328
## iter  30 value 142.269203
## iter  40 value 133.721992
## iter  50 value 129.802858
## iter  60 value 124.854983
## iter  70 value 123.988006
## iter  80 value 123.228425
## iter  90 value 123.066805
## iter 100 value 123.060794
## final  value 123.060794 
## stopped after 100 iterations
## # weights:  22
## initial  value 479.570991 
## iter  10 value 239.588741
## iter  20 value 210.838259
## iter  30 value 204.324875
## iter  40 value 202.009221
## iter  50 value 201.788798
## iter  60 value 201.765551
## iter  70 value 201.761330
## iter  80 value 201.760186
## iter  90 value 201.750955
## iter 100 value 201.750538
## final  value 201.750538 
## stopped after 100 iterations
## # weights:  64
## initial  value 600.451611 
## iter  10 value 206.318694
## iter  20 value 143.571415
## iter  30 value 128.779917
## iter  40 value 117.098451
## iter  50 value 112.326112
## iter  60 value 111.499250
## iter  70 value 111.231636
## iter  80 value 111.155016
## iter  90 value 110.703891
## iter 100 value 110.665171
## final  value 110.665171 
## stopped after 100 iterations
## # weights:  106
## initial  value 478.245264 
## iter  10 value 187.906782
## iter  20 value 139.133392
## iter  30 value 101.679058
## iter  40 value 91.434548
## iter  50 value 88.619352
## iter  60 value 86.130642
## iter  70 value 82.179961
## iter  80 value 81.317389
## iter  90 value 79.211189
## iter 100 value 78.512235
## final  value 78.512235 
## stopped after 100 iterations
## # weights:  22
## initial  value 490.624575 
## iter  10 value 222.640802
## iter  20 value 210.204129
## iter  30 value 207.842664
## final  value 207.820739 
## converged
## # weights:  64
## initial  value 533.390017 
## iter  10 value 214.539914
## iter  20 value 152.390520
## iter  30 value 133.106632
## iter  40 value 121.748357
## iter  50 value 116.226451
## iter  60 value 115.175046
## iter  70 value 114.397849
## iter  80 value 113.958598
## iter  90 value 113.837299
## iter 100 value 113.600500
## final  value 113.600500 
## stopped after 100 iterations
## # weights:  106
## initial  value 629.919946 
## iter  10 value 185.675646
## iter  20 value 125.621372
## iter  30 value 96.091136
## iter  40 value 87.031036
## iter  50 value 82.748751
## iter  60 value 80.227050
## iter  70 value 79.326094
## iter  80 value 78.638379
## iter  90 value 77.723398
## iter 100 value 77.123577
## final  value 77.123577 
## stopped after 100 iterations
## # weights:  22
## initial  value 533.279188 
## iter  10 value 225.101048
## iter  20 value 219.312344
## iter  30 value 214.636135
## iter  40 value 213.943194
## final  value 213.942827 
## converged
## # weights:  64
## initial  value 508.191424 
## iter  10 value 233.604479
## iter  20 value 206.964656
## iter  30 value 201.713659
## iter  40 value 197.634508
## iter  50 value 185.914558
## iter  60 value 182.784930
## iter  70 value 180.769323
## iter  80 value 180.145534
## iter  90 value 180.007041
## iter 100 value 179.940679
## final  value 179.940679 
## stopped after 100 iterations
## # weights:  106
## initial  value 499.185031 
## iter  10 value 213.786829
## iter  20 value 164.428289
## iter  30 value 136.221598
## iter  40 value 123.887468
## iter  50 value 120.483084
## iter  60 value 117.818744
## iter  70 value 115.271610
## iter  80 value 112.719480
## iter  90 value 110.833728
## iter 100 value 108.899260
## final  value 108.899260 
## stopped after 100 iterations
## # weights:  22
## initial  value 538.341476 
## iter  10 value 237.496285
## iter  20 value 214.833840
## iter  30 value 211.636876
## iter  40 value 207.646113
## iter  50 value 205.566261
## iter  60 value 205.486486
## iter  70 value 205.452306
## iter  80 value 205.441142
## iter  90 value 205.430359
## iter 100 value 205.424095
## final  value 205.424095 
## stopped after 100 iterations
## # weights:  64
## initial  value 546.609356 
## iter  10 value 196.046443
## iter  20 value 163.379190
## iter  30 value 150.941530
## iter  40 value 139.989574
## iter  50 value 132.464421
## iter  60 value 131.689669
## iter  70 value 131.417755
## iter  80 value 131.168936
## iter  90 value 130.905867
## iter 100 value 130.399112
## final  value 130.399112 
## stopped after 100 iterations
## # weights:  106
## initial  value 510.598170 
## iter  10 value 192.445633
## iter  20 value 97.861480
## iter  30 value 81.337435
## iter  40 value 74.442601
## iter  50 value 71.031093
## iter  60 value 68.804470
## iter  70 value 68.133115
## iter  80 value 67.547104
## iter  90 value 66.649585
## iter 100 value 65.850290
## final  value 65.850290 
## stopped after 100 iterations
## # weights:  22
## initial  value 482.277553 
## iter  10 value 228.008550
## iter  20 value 206.821881
## iter  30 value 200.761631
## iter  40 value 197.892023
## iter  50 value 197.586067
## iter  60 value 197.577746
## iter  70 value 197.575328
## final  value 197.574563 
## converged
## # weights:  64
## initial  value 517.528887 
## iter  10 value 228.219852
## iter  20 value 155.951297
## iter  30 value 120.490872
## iter  40 value 93.209961
## iter  50 value 87.118503
## iter  60 value 85.365829
## iter  70 value 84.694504
## iter  80 value 83.771429
## iter  90 value 83.215521
## iter 100 value 83.019680
## final  value 83.019680 
## stopped after 100 iterations
## # weights:  106
## initial  value 601.950965 
## iter  10 value 215.549978
## iter  20 value 150.788992
## iter  30 value 133.393255
## iter  40 value 116.117468
## iter  50 value 108.970612
## iter  60 value 103.296244
## iter  70 value 100.036144
## iter  80 value 98.580067
## iter  90 value 97.060140
## iter 100 value 96.544356
## final  value 96.544356 
## stopped after 100 iterations
## # weights:  22
## initial  value 478.527215 
## iter  10 value 304.898358
## iter  20 value 259.597579
## iter  30 value 228.066986
## iter  40 value 216.348298
## iter  50 value 212.528975
## iter  60 value 211.513613
## iter  70 value 211.442415
## final  value 211.442411 
## converged
## # weights:  64
## initial  value 491.497249 
## iter  10 value 281.608746
## iter  20 value 229.889456
## iter  30 value 183.816798
## iter  40 value 164.195440
## iter  50 value 154.968970
## iter  60 value 149.067140
## iter  70 value 145.639178
## iter  80 value 144.180578
## iter  90 value 143.833329
## iter 100 value 143.751144
## final  value 143.751144 
## stopped after 100 iterations
## # weights:  106
## initial  value 561.592704 
## iter  10 value 248.969846
## iter  20 value 196.311444
## iter  30 value 164.283067
## iter  40 value 138.360388
## iter  50 value 128.524126
## iter  60 value 117.399882
## iter  70 value 114.585085
## iter  80 value 113.938279
## iter  90 value 113.676978
## iter 100 value 113.497700
## final  value 113.497700 
## stopped after 100 iterations
## # weights:  22
## initial  value 504.884748 
## iter  10 value 229.078881
## iter  20 value 208.336919
## iter  30 value 205.227520
## iter  40 value 204.679952
## final  value 204.677332 
## converged
## # weights:  64
## initial  value 472.753790 
## iter  10 value 173.304453
## iter  20 value 131.657605
## iter  30 value 119.242983
## iter  40 value 106.766844
## iter  50 value 101.878823
## iter  60 value 98.834684
## iter  70 value 98.599788
## iter  80 value 98.242664
## iter  90 value 98.021181
## iter 100 value 97.949987
## final  value 97.949987 
## stopped after 100 iterations
## # weights:  106
## initial  value 571.026805 
## iter  10 value 207.663813
## iter  20 value 133.913842
## iter  30 value 83.396009
## iter  40 value 69.472277
## iter  50 value 56.934879
## iter  60 value 50.527696
## iter  70 value 49.413069
## iter  80 value 49.035627
## iter  90 value 48.202948
## iter 100 value 47.578535
## final  value 47.578535 
## stopped after 100 iterations
## # weights:  22
## initial  value 482.549408 
## iter  10 value 275.280775
## iter  20 value 215.460663
## iter  30 value 207.963343
## iter  40 value 199.940581
## iter  50 value 199.659451
## iter  60 value 199.650021
## iter  70 value 199.647067
## iter  80 value 199.646226
## iter  90 value 199.645671
## final  value 199.645579 
## converged
## # weights:  64
## initial  value 487.874948 
## iter  10 value 211.956124
## iter  20 value 173.447508
## iter  30 value 140.080236
## iter  40 value 131.029077
## iter  50 value 120.309696
## iter  60 value 115.781480
## iter  70 value 113.872838
## iter  80 value 113.496045
## iter  90 value 112.990074
## iter 100 value 112.500253
## final  value 112.500253 
## stopped after 100 iterations
## # weights:  106
## initial  value 489.480874 
## iter  10 value 224.101220
## iter  20 value 145.540325
## iter  30 value 101.524191
## iter  40 value 80.412037
## iter  50 value 74.679314
## iter  60 value 74.271843
## iter  70 value 74.266162
## final  value 74.265773 
## converged
## # weights:  22
## initial  value 479.191873 
## iter  10 value 305.277501
## iter  20 value 265.345831
## iter  30 value 249.654598
## iter  40 value 225.009267
## iter  50 value 220.045811
## iter  60 value 218.518627
## final  value 218.512131 
## converged
## # weights:  64
## initial  value 457.431317 
## iter  10 value 221.269451
## iter  20 value 176.902649
## iter  30 value 164.202151
## iter  40 value 154.836446
## iter  50 value 153.103268
## iter  60 value 152.043622
## iter  70 value 151.844812
## iter  80 value 151.818931
## final  value 151.818855 
## converged
## # weights:  106
## initial  value 508.572741 
## iter  10 value 240.174383
## iter  20 value 190.497945
## iter  30 value 164.135414
## iter  40 value 153.082995
## iter  50 value 140.871428
## iter  60 value 132.947766
## iter  70 value 128.267256
## iter  80 value 119.078645
## iter  90 value 117.614718
## iter 100 value 116.692207
## final  value 116.692207 
## stopped after 100 iterations
## # weights:  22
## initial  value 476.697854 
## iter  10 value 334.967937
## iter  20 value 282.664015
## iter  30 value 222.267958
## iter  40 value 212.981549
## iter  50 value 199.071730
## iter  60 value 195.214385
## iter  70 value 187.816529
## iter  80 value 187.742041
## iter  90 value 187.704229
## iter 100 value 187.691558
## final  value 187.691558 
## stopped after 100 iterations
## # weights:  64
## initial  value 566.504548 
## iter  10 value 299.130276
## iter  20 value 195.857937
## iter  30 value 170.559109
## iter  40 value 156.982426
## iter  50 value 145.195198
## iter  60 value 142.071571
## iter  70 value 141.559744
## iter  80 value 140.008499
## iter  90 value 139.206068
## iter 100 value 137.810607
## final  value 137.810607 
## stopped after 100 iterations
## # weights:  106
## initial  value 550.878011 
## iter  10 value 184.486618
## iter  20 value 121.682562
## iter  30 value 94.252018
## iter  40 value 71.277466
## iter  50 value 66.661087
## iter  60 value 64.965166
## iter  70 value 63.606443
## iter  80 value 61.708375
## iter  90 value 61.138278
## iter 100 value 60.379710
## final  value 60.379710 
## stopped after 100 iterations
## # weights:  22
## initial  value 496.590281 
## iter  10 value 241.469850
## iter  20 value 209.832480
## iter  30 value 206.065956
## iter  40 value 195.006639
## iter  50 value 194.795961
## iter  60 value 194.794110
## iter  70 value 194.792551
## final  value 194.792126 
## converged
## # weights:  64
## initial  value 589.639089 
## iter  10 value 201.668448
## iter  20 value 143.893677
## iter  30 value 111.524107
## iter  40 value 96.684742
## iter  50 value 87.518577
## iter  60 value 80.561443
## iter  70 value 70.783637
## iter  80 value 68.451110
## iter  90 value 66.329588
## iter 100 value 66.256496
## final  value 66.256496 
## stopped after 100 iterations
## # weights:  106
## initial  value 521.704818 
## iter  10 value 193.685918
## iter  20 value 137.947185
## iter  30 value 100.709722
## iter  40 value 67.983825
## iter  50 value 58.484890
## iter  60 value 55.499838
## iter  70 value 52.144689
## iter  80 value 51.700068
## iter  90 value 51.419255
## iter 100 value 51.310597
## final  value 51.310597 
## stopped after 100 iterations
## # weights:  22
## initial  value 474.611216 
## iter  10 value 253.598154
## iter  20 value 224.535685
## iter  30 value 217.398120
## final  value 217.393113 
## converged
## # weights:  64
## initial  value 483.778745 
## iter  10 value 200.896445
## iter  20 value 157.502901
## iter  30 value 150.192851
## iter  40 value 147.801373
## iter  50 value 147.007784
## iter  60 value 146.797344
## iter  70 value 146.767961
## final  value 146.767079 
## converged
## # weights:  106
## initial  value 478.671866 
## iter  10 value 189.861550
## iter  20 value 143.557450
## iter  30 value 133.184573
## iter  40 value 126.180046
## iter  50 value 124.318961
## iter  60 value 123.155455
## iter  70 value 122.970030
## iter  80 value 122.916477
## iter  90 value 122.224699
## iter 100 value 121.692491
## final  value 121.692491 
## stopped after 100 iterations
## # weights:  22
## initial  value 496.045000 
## iter  10 value 281.283588
## iter  20 value 250.801994
## iter  30 value 217.136551
## iter  40 value 210.466873
## iter  50 value 206.798219
## iter  60 value 204.728980
## iter  70 value 202.882020
## iter  80 value 202.845438
## iter  90 value 202.825884
## iter 100 value 202.824628
## final  value 202.824628 
## stopped after 100 iterations
## # weights:  64
## initial  value 506.447764 
## iter  10 value 227.987444
## iter  20 value 168.497994
## iter  30 value 153.622286
## iter  40 value 138.335234
## iter  50 value 126.665422
## iter  60 value 121.527205
## iter  70 value 119.696532
## iter  80 value 119.178456
## iter  90 value 118.962614
## iter 100 value 117.359415
## final  value 117.359415 
## stopped after 100 iterations
## # weights:  106
## initial  value 463.503059 
## iter  10 value 211.316842
## iter  20 value 125.595613
## iter  30 value 96.120852
## iter  40 value 82.929951
## iter  50 value 80.176199
## iter  60 value 79.352062
## iter  70 value 78.792841
## iter  80 value 78.327423
## iter  90 value 77.452530
## iter 100 value 76.822641
## final  value 76.822641 
## stopped after 100 iterations
## # weights:  106
## initial  value 585.930121 
## iter  10 value 202.208353
## iter  20 value 136.327012
## iter  30 value 96.791432
## iter  40 value 66.134100
## iter  50 value 48.497589
## iter  60 value 45.980674
## iter  70 value 45.212678
## iter  80 value 45.208154
## iter  90 value 45.208076
## final  value 45.208076 
## converged

Notably, this model took longer to train, which is presumably because the dummy encoding gives it more input variables to process.

plot(modell_nn5)

The two neural networks trained with caret's train function perform very similarly, and the best parameter fit is identical for both. The only difference is that, for the model trained on the one-hot-encoded data and the upsampled training set, the configurations with fewer weights perform better at higher numbers of hidden units. That difference, however, amounts to only fractions of a percentage point.

modell_nn5_best <- modell_nn5$bestTune
modell_nn5_best
##   size decay
## 7    5     0
predict_testNN_5 = predict(modell_nn5, testset_nn)
#predict_testNN_5 <-sapply(predict_testNN_5,round,digits=0)
nn_table5 <- table(testset_nn$target, predict_testNN_5)
results_nn5 <- data.frame(actual = testset_nn$target, prediction = predict_testNN_5)
conf_nn5 <- confusionMatrix(nn_table5)
conf_nn5
## Confusion Matrix and Statistics
## 
##    predict_testNN_5
##      0  1
##   0 74 20
##   1  2  9
##                                           
##                Accuracy : 0.7905          
##                  95% CI : (0.7001, 0.8638)
##     No Information Rate : 0.7238          
##     P-Value [Acc > NIR] : 0.0751910       
##                                           
##                   Kappa : 0.3515          
##                                           
##  Mcnemar's Test P-Value : 0.0002896       
##                                           
##             Sensitivity : 0.9737          
##             Specificity : 0.3103          
##          Pos Pred Value : 0.7872          
##          Neg Pred Value : 0.8182          
##              Prevalence : 0.7238          
##          Detection Rate : 0.7048          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.6420          
##                                           
##        'Positive' Class : 0               
## 

This model reaches a test accuracy of 79%. Strikingly, it flags 20 patients as Corona-positive even though they are healthy, while only 2 patients are wrongly classified as healthy. The specificity is nevertheless very weak, at just over 31%.
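As a sanity check, the reported metrics can be recomputed by hand from the cell values of the confusion matrix above. One caveat worth noting: `nn_table5` was built as `table(actual, predicted)`, but caret's `confusionMatrix()` expects predictions in the rows, so the printed sensitivity and specificity are effectively computed on the transposed table. The arithmetic below reproduces caret's numbers under that reading (positive class "0"):

```r
# Cell values copied from the confusion matrix printed above,
# read the way confusionMatrix() reads them (rows = prediction).
tp <- 74; fn <- 2    # predicted 0 / actually 0 and 1
fp <- 20; tn <- 9    # predicted 1 / actually 0 and 1

sensitivity <- tp / (tp + fn)                    # 74 / 76  ~ 0.9737
specificity <- tn / (tn + fp)                    # 9 / 29   ~ 0.3103
accuracy    <- (tp + tn) / (tp + fn + fp + tn)   # 83 / 105 ~ 0.7905

round(c(sensitivity, specificity, accuracy), 4)
```

Accuracy is unaffected by the orientation of the table, but if the rows were actually intended as the reference, sensitivity and specificity would swap their denominators.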

acc_nn5 <- conf_nn5$overall[1]
sens_nn5 <- conf_nn5$byClass[1]
spec_nn5 <- conf_nn5$byClass[2]

The overview shows very clearly that the two neural networks built with the nnet function (one on scaled and encoded data, one on unpreprocessed data) perform similarly, with test accuracies of roughly 73% and 85%. For predicting actually infected patients, however, these two models are of little use, since their specificities are only about 19% and 38%. The two neural networks trained with the caret library, in contrast, reach test accuracies of 91% and 79%, and model 4 in particular also achieves the best specificity (60%). These networks were trained with 10-fold cross-validation with 3 repeats, and the data were additionally min-max scaled.
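The resampling setup described above can be sketched as follows. This is a hedged sketch only: the actual tuning grid, seeds, and variable names used for `modell_nn4`/`modell_nn5` are not shown in this section, so the grid values and the `train_x`/`train_y` placeholders below are illustrative assumptions.

```r
library(caret)

# 10-fold cross-validation, repeated 3 times, as described in the text.
ctrl <- trainControl(method = "repeatedcv", number = 10, repeats = 3)

# Illustrative call (not the exact call used in this report):
# preProcess = "range" performs the min-max scaling mentioned above,
# and the size/decay grid is an assumed example for nnet's tuning
# parameters.
# modell_nn <- train(x = train_x, y = train_y,
#                    method = "nnet",
#                    preProcess = "range",
#                    trControl = ctrl,
#                    tuneGrid = expand.grid(size = 1:7,
#                                           decay = c(0, 0.1, 0.5)),
#                    trace = FALSE)
```

With `trace = FALSE`, the lengthy per-iteration nnet output shown earlier in this section would also be suppressed in the knitted report.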

library(kableExtra)
modell <- c(2,3,4,5)
test_acc <- c(acc_nn2, acc_nn3, acc_nn4, acc_nn5)
sens <- c(sens_nn2, sens_nn3, sens_nn4, sens_nn5)
spec <- c(spec_nn2, spec_nn3, spec_nn4, spec_nn5)
results_nn = data.frame(
  "model" = modell,
  "sensitivity" = sens,
  "Specificity" = spec,
  "Test Accuracy" = test_acc
)

kable_styling(kable(results_nn, format = "html", digits = 4), full_width = FALSE)
model   sensitivity   Specificity   Test.Accuracy
    2        0.9231        0.1852          0.7333
    3        0.9643        0.3810          0.8476
    4        0.9474        0.6000          0.9143
    5        0.9737        0.3103          0.7905
train_eng_nn <- read.csv("data/clean/train_feat_eng.csv")
test_eng_nn <- read.csv("data/clean/test_feat_eng.csv")
glimpse(train_eng_nn)
## Rows: 760
## Columns: 16
## $ age               <int> 17, 1, 9, 11, 13, 9, 17, 17, 19, 10, 11, 11, 16, 15…
## $ target            <int> 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, …
## $ reg_ward          <int> 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, …
## $ semi_unit         <int> 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, …
## $ intense_unit      <int> 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, …
## $ sickness          <int> 1, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1, …
## $ Hematocrit        <dbl> 0.23651545, -1.57168221, -0.74769306, 0.99183822, 1…
## $ Platelets         <dbl> -0.51741302, 1.42966747, -0.42948034, 0.07299204, -…
## $ Platelets_vol     <dbl> 0.01067657, -1.67222178, -0.21371073, -0.55028951, …
## $ Lymphocytes       <dbl> 0.318365753, -0.005738043, -1.114513755, 0.04543625…
## $ mean_hemoglobin   <dbl> -0.95079035, 3.33107066, 0.54288238, -0.45289949, -…
## $ Leukocytes        <dbl> -9.461035e-02, 3.645505e-01, -8.849232e-01, -2.1148…
## $ Eosinophils       <dbl> 1.48215818, 1.01862502, -0.66695017, -0.70908952, 0…
## $ Monocytes         <dbl> 0.35754666, 0.06865151, 1.27675891, -0.22024387, 0.…
## $ age_plat_leuk_eos <dbl> 19.401150, 5.945848, 18.735579, 17.303070, 17.41957…
## $ age_leuk_eos      <dbl> 10.059854, 4.281499, 9.859892, 9.911350, 9.564118, …
set.seed(1910837262)
# Upsampling: die Minderheitsklasse wird per Ziehen mit Zurücklegen auf die
# Größe der Mehrheitsklasse gebracht; -ncol() entfernt hier die letzte
# Spalte (age_leuk_eos) aus x
up_train_eng_nn <- upSample(x = train_eng_nn[, -ncol(train_eng_nn)],
                            y = as.factor(train_eng_nn$target))
table(up_train_eng_nn$target)
## 
##   0   1 
## 380 380
up_train_eng_nn <- up_train_eng_nn %>%
  select(-Class)

up_train_eng_nn_x <- up_train_eng_nn %>%
  select(-target)
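Was caret::upSample dabei im Kern macht, lässt sich in Basis-R nachbilden (vereinfachte Skizze; der Funktionsname up_sample_sketch ist frei gewählt):

```r
# Vereinfachtes Upsampling: Zeilen der kleineren Klasse werden mit
# Zurücklegen gezogen, bis beide Klassen gleich groß sind
up_sample_sketch <- function(df, target_col) {
  counts <- table(df[[target_col]])
  n_max  <- max(counts)
  idx <- unlist(lapply(names(counts), function(cl) {
    rows <- which(df[[target_col]] == cl)
    c(rows, sample(rows, n_max - length(rows), replace = TRUE))
  }))
  df[idx, , drop = FALSE]
}

set.seed(1)
df <- data.frame(target = c(rep(0, 8), rep(1, 2)), x = rnorm(10))
table(up_sample_sketch(df, "target")$target)
#  0  1
#  8  8
```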

Nun trainieren wir noch ein Neuronales Netz mit den Inputdaten, die wir durch das Feature Engineering leicht verändert haben. Die übrigen Prozessschritte bleiben identisch.

# Hinweis: target liegt hier als Integer vor; für eine saubere Klassifikation
# sollte es als Faktor übergeben werden (daher die Warnung unten)
modell_nn6 <- train(up_train_eng_nn[,-2], up_train_eng_nn$target,
                    method = "nnet",
                    trControl = TrainingParameters_nn,
                    preProcess = c("scale", "center"),
                    na.action = na.omit
)
## Warning in train.default(up_train_eng_nn[, -2], up_train_eng_nn$target, : You
## are trying to do regression and your outcome only has two possible values Are
## you trying to do classification? If so, use a 2 level factor as your outcome
## column.
## # weights:  17
## initial  value 171.345733 
## iter  10 value 128.000002
## iter  10 value 128.000001
## iter  10 value 128.000001
## final  value 128.000001 
## converged
## ... (weitere nnet-Iterationsprotokolle der wiederholten Cross-Validation gekürzt)
## # weights:  17
## initial  value 182.655839 
## iter  10 value 67.090219
## iter  20 value 58.696907
## iter  30 value 57.286977
## iter  40 value 56.339902
## iter  50 value 55.864835
## iter  60 value 54.778676
## iter  70 value 54.308925
## iter  80 value 54.299293
## iter  90 value 54.286841
## iter 100 value 54.279432
## final  value 54.279432 
## stopped after 100 iterations
## # weights:  49
## initial  value 170.480915 
## iter  10 value 103.443559
## iter  20 value 72.710512
## iter  30 value 68.436293
## iter  40 value 59.875955
## iter  50 value 56.018838
## iter  60 value 55.397061
## iter  70 value 53.774633
## iter  80 value 52.618110
## iter  90 value 50.795917
## iter 100 value 49.689117
## final  value 49.689117 
## stopped after 100 iterations
## # weights:  81
## initial  value 189.510035 
## iter  10 value 103.613660
## iter  20 value 85.663633
## iter  30 value 82.411835
## iter  40 value 78.352086
## iter  50 value 77.864162
## iter  60 value 77.393499
## iter  70 value 76.187585
## iter  80 value 74.688208
## iter  90 value 74.439278
## iter 100 value 74.381268
## final  value 74.381268 
## stopped after 100 iterations
## # weights:  17
## initial  value 193.000674 
## iter  10 value 79.435809
## iter  20 value 70.335092
## iter  30 value 65.439878
## iter  40 value 62.587034
## iter  50 value 60.594915
## iter  60 value 59.921570
## iter  70 value 59.229973
## iter  80 value 58.444879
## iter  90 value 58.381579
## iter 100 value 58.343417
## final  value 58.343417 
## stopped after 100 iterations
## # weights:  49
## initial  value 180.188184 
## final  value 118.999991 
## converged
## # weights:  81
## initial  value 193.273834 
## iter  10 value 80.562966
## iter  20 value 73.973970
## iter  30 value 72.151362
## iter  40 value 71.990759
## iter  50 value 71.929476
## iter  60 value 57.703120
## iter  70 value 51.733706
## iter  80 value 48.224170
## iter  90 value 46.810690
## iter 100 value 45.932024
## final  value 45.932024 
## stopped after 100 iterations
## # weights:  17
## initial  value 173.397154 
## iter  10 value 84.650137
## iter  20 value 74.880565
## iter  30 value 70.579074
## iter  40 value 65.918888
## iter  50 value 65.176814
## final  value 65.172965 
## converged
## # weights:  49
## initial  value 171.512269 
## iter  10 value 82.150176
## iter  20 value 65.351352
## iter  30 value 59.004505
## iter  40 value 56.847031
## iter  50 value 50.272857
## iter  60 value 49.213996
## iter  70 value 49.079288
## iter  80 value 49.049600
## iter  90 value 49.011284
## iter 100 value 48.987749
## final  value 48.987749 
## stopped after 100 iterations
## # weights:  81
## initial  value 167.547822 
## iter  10 value 88.203595
## iter  20 value 57.034462
## iter  30 value 52.359706
## iter  40 value 51.263365
## iter  50 value 50.470069
## iter  60 value 49.556156
## iter  70 value 49.322606
## iter  80 value 49.294241
## iter  90 value 49.293073
## final  value 49.293056 
## converged
## # weights:  17
## initial  value 174.326071 
## iter  10 value 102.239630
## iter  20 value 88.973419
## iter  30 value 86.038829
## iter  40 value 80.732605
## iter  50 value 80.621963
## iter  60 value 79.711179
## iter  70 value 79.695319
## iter  80 value 78.774336
## iter  90 value 78.765835
## iter 100 value 78.754763
## final  value 78.754763 
## stopped after 100 iterations
## # weights:  49
## initial  value 182.505538 
## iter  10 value 98.522424
## iter  20 value 62.729165
## iter  30 value 45.072459
## iter  40 value 39.544253
## iter  50 value 38.894151
## iter  60 value 37.118056
## iter  70 value 35.949456
## iter  80 value 35.313520
## iter  90 value 35.192651
## iter 100 value 35.086222
## final  value 35.086222 
## stopped after 100 iterations
## # weights:  81
## initial  value 195.188294 
## iter  10 value 119.263583
## iter  20 value 119.254506
## iter  30 value 119.242464
## iter  40 value 64.302002
## iter  50 value 43.085984
## iter  60 value 34.475925
## iter  70 value 29.509648
## iter  80 value 28.508511
## iter  90 value 28.263357
## iter 100 value 28.096125
## final  value 28.096125 
## stopped after 100 iterations
## # weights:  17
## initial  value 169.728233 
## iter  10 value 86.009838
## iter  20 value 79.975684
## iter  30 value 78.218758
## iter  40 value 77.517459
## iter  50 value 76.410354
## iter  60 value 75.895475
## iter  70 value 75.268219
## iter  80 value 65.231157
## iter  90 value 64.954985
## iter 100 value 64.046042
## final  value 64.046042 
## stopped after 100 iterations
## # weights:  49
## initial  value 181.104480 
## iter  10 value 99.403564
## iter  20 value 74.358549
## iter  30 value 68.407088
## iter  40 value 64.034103
## iter  50 value 61.694719
## iter  60 value 60.868481
## iter  70 value 60.724978
## iter  80 value 60.240529
## iter  90 value 59.935910
## iter 100 value 59.639234
## final  value 59.639234 
## stopped after 100 iterations
## # weights:  81
## initial  value 191.804990 
## iter  10 value 91.812749
## iter  20 value 72.272917
## iter  30 value 61.336611
## iter  40 value 59.157607
## iter  50 value 57.788686
## iter  60 value 54.756200
## iter  70 value 54.081633
## iter  80 value 53.995334
## iter  90 value 53.773689
## iter 100 value 52.461788
## final  value 52.461788 
## stopped after 100 iterations
## # weights:  17
## initial  value 170.879014 
## iter  10 value 74.783998
## iter  20 value 68.035462
## iter  30 value 67.818860
## final  value 67.818219 
## converged
## # weights:  49
## initial  value 189.589513 
## iter  10 value 99.686717
## iter  20 value 67.597771
## iter  30 value 64.506732
## iter  40 value 60.595165
## iter  50 value 58.194500
## iter  60 value 57.656386
## iter  70 value 57.220370
## iter  80 value 57.043384
## iter  90 value 56.936031
## iter 100 value 56.929679
## final  value 56.929679 
## stopped after 100 iterations
## # weights:  81
## initial  value 184.700114 
## iter  10 value 73.289358
## iter  20 value 56.364874
## iter  30 value 53.434787
## iter  40 value 47.291422
## iter  50 value 45.790732
## iter  60 value 45.756744
## iter  70 value 45.756128
## final  value 45.756119 
## converged
## # weights:  17
## initial  value 178.881175 
## iter  10 value 119.357687
## iter  20 value 117.369588
## iter  30 value 117.291944
## iter  40 value 112.059800
## iter  50 value 100.898052
## iter  60 value 88.184396
## iter  70 value 81.601703
## iter  80 value 80.932926
## iter  90 value 79.389991
## iter 100 value 79.273184
## final  value 79.273184 
## stopped after 100 iterations
## # weights:  49
## initial  value 180.033235 
## iter  10 value 110.574337
## iter  20 value 107.755594
## iter  30 value 107.266444
## iter  40 value 105.273931
## iter  50 value 103.396137
## iter  60 value 101.270390
## iter  70 value 99.251086
## iter  80 value 96.861527
## iter  90 value 96.169846
## iter 100 value 95.177828
## final  value 95.177828 
## stopped after 100 iterations
## # weights:  81
## initial  value 158.183317 
## iter  10 value 93.121746
## iter  20 value 86.876263
## iter  30 value 86.082652
## iter  40 value 84.278203
## iter  50 value 83.301829
## iter  60 value 83.237518
## iter  70 value 82.321297
## iter  80 value 82.248107
## iter  90 value 82.224868
## iter 100 value 82.189874
## final  value 82.189874 
## stopped after 100 iterations
## # weights:  17
## initial  value 179.008256 
## iter  10 value 75.145038
## iter  20 value 71.107564
## iter  30 value 69.171181
## iter  40 value 69.167016
## iter  50 value 64.361829
## iter  60 value 62.441224
## iter  70 value 61.733698
## iter  80 value 60.783172
## iter  90 value 59.903069
## iter 100 value 59.829612
## final  value 59.829612 
## stopped after 100 iterations
## # weights:  49
## initial  value 177.747140 
## iter  10 value 96.493705
## iter  20 value 74.692992
## iter  30 value 73.961907
## iter  40 value 72.560568
## iter  50 value 70.729418
## iter  60 value 70.559615
## iter  70 value 70.380319
## iter  80 value 70.299105
## iter  90 value 70.254817
## iter 100 value 70.118241
## final  value 70.118241 
## stopped after 100 iterations
## # weights:  81
## initial  value 174.805562 
## iter  10 value 96.951502
## iter  20 value 83.873344
## iter  30 value 79.206497
## iter  40 value 77.372279
## iter  50 value 76.376320
## iter  60 value 74.576152
## iter  70 value 74.309572
## iter  80 value 73.306192
## iter  90 value 73.295084
## iter 100 value 72.743681
## final  value 72.743681 
## stopped after 100 iterations
## # weights:  17
## initial  value 173.557817 
## iter  10 value 72.574253
## iter  20 value 59.858040
## iter  30 value 58.956791
## iter  40 value 58.890242
## iter  50 value 58.890030
## final  value 58.890029 
## converged
## # weights:  49
## initial  value 179.250690 
## iter  10 value 72.824029
## iter  20 value 59.088036
## iter  30 value 55.635658
## iter  40 value 54.109578
## iter  50 value 53.669009
## iter  60 value 53.054029
## iter  70 value 52.896477
## iter  80 value 52.893967
## final  value 52.893959 
## converged
## # weights:  81
## initial  value 175.476038 
## iter  10 value 76.686283
## iter  20 value 52.764162
## iter  30 value 48.002387
## iter  40 value 44.813480
## iter  50 value 44.011716
## iter  60 value 42.620878
## iter  70 value 42.329287
## iter  80 value 42.282225
## iter  90 value 42.245163
## iter 100 value 42.238796
## final  value 42.238796 
## stopped after 100 iterations
## # weights:  17
## initial  value 189.896341 
## iter  10 value 73.732245
## iter  20 value 64.552092
## iter  30 value 64.265598
## iter  40 value 64.142333
## iter  50 value 64.134848
## iter  60 value 64.132764
## iter  70 value 64.121438
## iter  80 value 64.113167
## iter  90 value 63.933340
## iter 100 value 58.256253
## final  value 58.256253 
## stopped after 100 iterations
## # weights:  49
## initial  value 186.260537 
## iter  10 value 79.683086
## iter  20 value 72.233239
## iter  30 value 71.246736
## iter  40 value 69.675633
## iter  50 value 69.254512
## iter  60 value 69.237994
## iter  70 value 69.221109
## iter  80 value 68.239490
## iter  90 value 68.187535
## iter 100 value 68.180113
## final  value 68.180113 
## stopped after 100 iterations
## # weights:  81
## initial  value 146.016393 
## iter  10 value 88.341479
## iter  20 value 80.298978
## iter  30 value 77.277672
## iter  40 value 72.306098
## iter  50 value 72.214511
## iter  60 value 69.793017
## iter  70 value 69.270588
## iter  80 value 69.240468
## iter  90 value 69.213669
## iter 100 value 69.184276
## final  value 69.184276 
## stopped after 100 iterations
## # weights:  17
## initial  value 182.742960 
## iter  10 value 101.606581
## iter  20 value 94.061636
## iter  30 value 92.278167
## iter  40 value 92.247949
## iter  50 value 92.102505
## iter  60 value 90.230296
## iter  70 value 88.167150
## iter  80 value 86.294664
## iter  90 value 84.702691
## iter 100 value 84.690033
## final  value 84.690033 
## stopped after 100 iterations
## # weights:  49
## initial  value 190.338090 
## iter  10 value 114.998708
## final  value 113.999984 
## converged
## # weights:  81
## initial  value 161.487830 
## iter  10 value 84.502375
## iter  20 value 77.565854
## iter  30 value 74.618212
## iter  40 value 74.011470
## iter  50 value 72.963224
## iter  60 value 69.017825
## iter  70 value 65.437553
## iter  80 value 64.064385
## iter  90 value 63.034962
## iter 100 value 63.003445
## final  value 63.003445 
## stopped after 100 iterations
## # weights:  17
## initial  value 171.584947 
## iter  10 value 96.694753
## iter  20 value 77.288492
## iter  30 value 70.202176
## iter  40 value 65.158096
## iter  50 value 65.109349
## final  value 65.108127 
## converged
## # weights:  49
## initial  value 183.357667 
## iter  10 value 68.946090
## iter  20 value 59.241134
## iter  30 value 56.332280
## iter  40 value 55.598127
## iter  50 value 53.751095
## iter  60 value 53.383251
## iter  70 value 52.850634
## iter  80 value 52.718420
## iter  90 value 52.710193
## final  value 52.710130 
## converged
## # weights:  81
## initial  value 193.041333 
## iter  10 value 88.963957
## iter  20 value 71.224302
## iter  30 value 61.649402
## iter  40 value 56.103278
## iter  50 value 53.634205
## iter  60 value 52.488870
## iter  70 value 52.056813
## iter  80 value 51.762911
## iter  90 value 51.121100
## iter 100 value 50.037840
## final  value 50.037840 
## stopped after 100 iterations
## # weights:  17
## initial  value 172.562078 
## iter  10 value 71.338990
## iter  20 value 61.064155
## iter  30 value 58.971035
## iter  40 value 56.428167
## iter  50 value 55.764172
## iter  60 value 55.601504
## iter  70 value 54.506582
## iter  80 value 54.498587
## iter  90 value 54.497334
## iter 100 value 54.497065
## final  value 54.497065 
## stopped after 100 iterations
## # weights:  49
## initial  value 199.793180 
## iter  10 value 81.397481
## iter  20 value 78.278831
## iter  30 value 77.276121
## iter  40 value 76.827027
## iter  50 value 74.698928
## iter  60 value 74.085359
## iter  70 value 71.740428
## iter  80 value 70.150746
## iter  90 value 68.221143
## iter 100 value 68.103238
## final  value 68.103238 
## stopped after 100 iterations
## # weights:  81
## initial  value 175.160295 
## iter  10 value 96.897195
## iter  20 value 63.524076
## iter  30 value 55.823076
## iter  40 value 53.175002
## iter  50 value 51.627680
## iter  60 value 51.436148
## iter  70 value 43.882837
## iter  80 value 40.610053
## iter  90 value 39.891654
## iter 100 value 39.062351
## final  value 39.062351 
## stopped after 100 iterations
## # weights:  17
## initial  value 178.679255 
## iter  10 value 111.175022
## iter  20 value 93.987485
## iter  30 value 91.936029
## iter  40 value 85.879972
## iter  50 value 84.704546
## iter  60 value 84.540269
## iter  70 value 77.704275
## iter  80 value 77.692560
## iter  90 value 77.690398
## iter 100 value 77.219958
## final  value 77.219958 
## stopped after 100 iterations
## # weights:  49
## initial  value 198.776960 
## iter  10 value 91.612358
## iter  20 value 87.835579
## iter  30 value 79.118889
## iter  40 value 73.084813
## iter  50 value 68.095850
## iter  60 value 67.899062
## iter  70 value 64.981283
## iter  80 value 64.852802
## iter  90 value 64.004463
## iter 100 value 63.111482
## final  value 63.111482 
## stopped after 100 iterations
## # weights:  81
## initial  value 201.735799 
## iter  10 value 102.700286
## iter  20 value 75.076499
## iter  30 value 71.995754
## iter  40 value 70.986710
## iter  50 value 70.942599
## iter  60 value 70.925799
## iter  70 value 70.875847
## iter  80 value 69.808442
## iter  90 value 69.616384
## iter 100 value 67.084379
## final  value 67.084379 
## stopped after 100 iterations
## # weights:  17
## initial  value 172.257489 
## iter  10 value 80.748671
## iter  20 value 64.501938
## iter  30 value 63.202886
## final  value 63.185001 
## converged
## # weights:  49
## initial  value 181.075049 
## iter  10 value 67.618673
## iter  20 value 55.879420
## iter  30 value 51.835260
## iter  40 value 51.091256
## iter  50 value 51.059529
## iter  60 value 51.045511
## final  value 51.045506 
## converged
## # weights:  81
## initial  value 195.895217 
## iter  10 value 109.763995
## iter  20 value 71.294547
## iter  30 value 52.425371
## iter  40 value 47.918530
## iter  50 value 45.216660
## iter  60 value 43.788015
## iter  70 value 43.162790
## iter  80 value 43.057980
## iter  90 value 43.049150
## iter 100 value 43.048925
## final  value 43.048925 
## stopped after 100 iterations
## # weights:  17
## initial  value 179.124037 
## iter  10 value 68.782691
## iter  20 value 56.108919
## iter  30 value 55.044800
## iter  40 value 53.196704
## iter  50 value 52.343043
## iter  60 value 51.435535
## iter  70 value 50.923904
## iter  80 value 50.548376
## iter  90 value 50.019312
## iter 100 value 49.970503
## final  value 49.970503 
## stopped after 100 iterations
## # weights:  49
## initial  value 171.874325 
## iter  10 value 88.822652
## iter  20 value 44.022332
## iter  30 value 25.856595
## iter  40 value 24.218729
## iter  50 value 24.161456
## iter  60 value 24.064750
## iter  70 value 23.878895
## iter  80 value 23.646760
## iter  90 value 23.491605
## iter 100 value 22.465641
## final  value 22.465641 
## stopped after 100 iterations
## # weights:  81
## initial  value 164.568500 
## iter  10 value 107.183239
## iter  20 value 102.138622
## iter  30 value 96.997692
## iter  40 value 91.037434
## iter  50 value 88.530968
## iter  60 value 86.905480
## iter  70 value 86.838623
## iter  80 value 86.350717
## iter  90 value 84.998841
## iter 100 value 84.205531
## final  value 84.205531 
## stopped after 100 iterations
## # weights:  17
## initial  value 172.072183 
## iter  10 value 93.785422
## iter  20 value 84.175495
## iter  30 value 82.951297
## iter  40 value 82.634163
## iter  50 value 82.618587
## iter  60 value 82.556031
## iter  70 value 82.542656
## iter  80 value 82.534807
## iter  90 value 82.532051
## iter 100 value 82.530772
## final  value 82.530772 
## stopped after 100 iterations
## # weights:  49
## initial  value 168.387829 
## iter  10 value 101.747170
## iter  20 value 86.112218
## iter  30 value 80.412067
## iter  40 value 79.187570
## iter  50 value 77.895893
## iter  60 value 76.363810
## iter  70 value 75.197850
## iter  80 value 73.434030
## iter  90 value 72.106411
## iter 100 value 72.078154
## final  value 72.078154 
## stopped after 100 iterations
## # weights:  81
## initial  value 181.487173 
## iter  10 value 84.971164
## iter  20 value 60.589932
## iter  30 value 56.101634
## iter  40 value 51.596441
## iter  50 value 47.403698
## iter  60 value 45.798036
## iter  70 value 45.189777
## iter  80 value 40.007925
## iter  90 value 36.685230
## iter 100 value 35.102977
## final  value 35.102977 
## stopped after 100 iterations
## # weights:  17
## initial  value 178.558423 
## iter  10 value 71.737412
## iter  20 value 64.972882
## iter  30 value 64.015764
## final  value 64.006591 
## converged
## # weights:  49
## initial  value 168.515120 
## iter  10 value 75.496919
## iter  20 value 57.683576
## iter  30 value 55.030934
## iter  40 value 51.293644
## iter  50 value 49.649891
## iter  60 value 48.153490
## iter  70 value 47.609398
## iter  80 value 47.597370
## final  value 47.597368 
## converged
## # weights:  81
## initial  value 187.585354 
## iter  10 value 67.713914
## iter  20 value 55.757477
## iter  30 value 53.552722
## iter  40 value 53.301612
## iter  50 value 52.986262
## iter  60 value 50.601659
## iter  70 value 50.253008
## iter  80 value 49.841068
## iter  90 value 49.571610
## iter 100 value 49.484627
## final  value 49.484627 
## stopped after 100 iterations
## # weights:  17
## initial  value 174.879336 
## iter  10 value 91.657866
## iter  20 value 77.690750
## iter  30 value 71.467563
## iter  40 value 68.297927
## iter  50 value 68.034626
## iter  60 value 67.911185
## iter  70 value 67.907149
## iter  80 value 67.903774
## iter  90 value 67.902186
## iter 100 value 67.901711
## final  value 67.901711 
## stopped after 100 iterations
## # weights:  49
## initial  value 168.672871 
## iter  10 value 116.258446
## iter  20 value 105.524149
## iter  30 value 93.105167
## iter  40 value 86.371251
## iter  50 value 84.526567
## iter  60 value 79.876188
## iter  70 value 76.756318
## iter  80 value 73.385223
## iter  90 value 72.570546
## iter 100 value 69.408009
## final  value 69.408009 
## stopped after 100 iterations
## # weights:  81
## initial  value 160.065272 
## iter  10 value 67.354120
## iter  20 value 38.608555
## iter  30 value 31.791862
## iter  40 value 30.564113
## iter  50 value 30.100069
## iter  60 value 28.971673
## iter  70 value 28.313625
## iter  80 value 27.061041
## iter  90 value 26.287177
## iter 100 value 25.712418
## final  value 25.712418 
## stopped after 100 iterations
## # weights:  17
## initial  value 172.592134 
## iter  10 value 99.625751
## iter  20 value 98.633305
## iter  30 value 97.665392
## iter  40 value 95.444969
## iter  50 value 91.840213
## iter  60 value 90.753384
## iter  70 value 90.729435
## iter  80 value 89.860197
## iter  90 value 89.713280
## iter 100 value 89.710571
## final  value 89.710571 
## stopped after 100 iterations
## # weights:  49
## initial  value 191.620159 
## iter  10 value 103.592280
## iter  20 value 85.698456
## iter  30 value 84.116943
## iter  40 value 82.756198
## iter  50 value 72.588388
## iter  60 value 67.954580
## iter  70 value 61.763608
## iter  80 value 59.741993
## iter  90 value 58.446661
## iter 100 value 57.409681
## final  value 57.409681 
## stopped after 100 iterations
## # weights:  81
## initial  value 166.131884 
## final  value 112.999999 
## converged
## # weights:  17
## initial  value 179.919662 
## iter  10 value 82.808248
## iter  20 value 74.194069
## iter  30 value 68.258848
## iter  40 value 63.583666
## final  value 63.576867 
## converged
## # weights:  49
## initial  value 170.120509 
## iter  10 value 66.487928
## iter  20 value 58.049269
## iter  30 value 52.501607
## iter  40 value 49.895196
## iter  50 value 49.265845
## iter  60 value 49.237698
## iter  70 value 49.237490
## final  value 49.237488 
## converged
## # weights:  81
## initial  value 196.742864 
## iter  10 value 94.622323
## iter  20 value 60.298834
## iter  30 value 51.512970
## iter  40 value 48.036352
## iter  50 value 46.955691
## iter  60 value 46.240349
## iter  70 value 45.950190
## iter  80 value 45.380090
## iter  90 value 44.247860
## iter 100 value 44.131664
## final  value 44.131664 
## stopped after 100 iterations
## # weights:  17
## initial  value 199.167654 
## iter  10 value 105.470724
## iter  20 value 93.983407
## iter  30 value 84.065713
## iter  40 value 76.536127
## iter  50 value 66.195557
## iter  60 value 65.687334
## iter  70 value 65.669748
## iter  80 value 65.666542
## iter  90 value 65.662293
## iter 100 value 65.657946
## final  value 65.657946 
## stopped after 100 iterations
## # weights:  49
## initial  value 169.882513 
## iter  10 value 109.363084
## iter  20 value 96.404907
## iter  30 value 91.234828
## iter  40 value 91.036715
## iter  50 value 90.024062
## iter  60 value 89.900710
## iter  70 value 88.669081
## iter  80 value 84.489782
## iter  90 value 84.388176
## iter 100 value 84.346000
## final  value 84.346000 
## stopped after 100 iterations
## # weights:  81
## initial  value 189.640529 
## iter  10 value 109.884303
## iter  20 value 92.484063
## iter  30 value 80.805631
## iter  40 value 77.313095
## iter  50 value 75.271144
## iter  60 value 72.092678
## iter  70 value 70.796795
## iter  80 value 67.985722
## iter  90 value 66.052408
## iter 100 value 63.169709
## final  value 63.169709 
## stopped after 100 iterations
## # weights:  17
## initial  value 192.306952 
## iter  10 value 95.662147
## iter  20 value 73.479583
## iter  30 value 70.913966
## iter  40 value 68.962296
## iter  50 value 67.695450
## iter  60 value 66.691113
## iter  70 value 66.597747
## iter  80 value 65.684082
## iter  90 value 65.549371
## iter 100 value 65.524198
## final  value 65.524198 
## stopped after 100 iterations
## # weights:  49
## initial  value 219.944326 
## iter  10 value 105.894979
## iter  20 value 91.991295
## iter  30 value 86.972693
## iter  40 value 85.276252
## iter  50 value 81.894356
## iter  60 value 80.850366
## iter  70 value 79.830974
## iter  80 value 79.815568
## iter  90 value 79.426410
## iter 100 value 77.925223
## final  value 77.925223 
## stopped after 100 iterations
## # weights:  81
## initial  value 185.101553 
## iter  10 value 78.294745
## iter  20 value 70.074779
## iter  30 value 62.184226
## iter  40 value 57.500041
## iter  50 value 52.708140
## iter  60 value 47.885403
## iter  70 value 47.436178
## iter  80 value 47.256098
## iter  90 value 46.850345
## iter 100 value 46.048368
## final  value 46.048368 
## stopped after 100 iterations
## # weights:  17
## initial  value 169.491794 
## iter  10 value 70.848923
## iter  20 value 63.266680
## iter  30 value 63.154283
## final  value 63.154229 
## converged
## # weights:  49
## initial  value 210.671569 
## iter  10 value 79.373787
## iter  20 value 63.549847
## iter  30 value 58.424389
## iter  40 value 56.944442
## iter  50 value 53.532693
## iter  60 value 51.871125
## iter  70 value 51.478329
## iter  80 value 51.468547
## final  value 51.468459 
## converged
## # weights:  81
## initial  value 211.859671 
## iter  10 value 63.285910
## iter  20 value 49.702331
## iter  30 value 46.526386
## iter  40 value 45.572893
## iter  50 value 45.126671
## iter  60 value 44.792015
## iter  70 value 44.714420
## iter  80 value 44.671689
## iter  90 value 44.670829
## final  value 44.670826 
## converged
## # weights:  17
## initial  value 176.951392 
## iter  10 value 84.226294
## iter  20 value 64.607696
## iter  30 value 54.929183
## iter  40 value 53.559752
## iter  50 value 53.225195
## iter  60 value 53.093383
## iter  70 value 52.956676
## final  value 52.940522 
## converged
## # weights:  49
## initial  value 186.222718 
## iter  10 value 97.374004
## iter  20 value 97.270023
## iter  30 value 88.679398
## iter  40 value 85.711096
## iter  50 value 82.271904
## iter  60 value 81.728928
## iter  70 value 81.579817
## iter  80 value 79.432652
## iter  90 value 77.919717
## iter 100 value 74.470643
## final  value 74.470643 
## stopped after 100 iterations
## # weights:  81
## initial  value 189.011444 
## iter  10 value 106.589751
## iter  20 value 99.343371
## iter  30 value 92.197175
## iter  40 value 85.013658
## iter  50 value 75.715067
## iter  60 value 74.175065
## iter  70 value 74.045708
## iter  80 value 73.171110
## iter  90 value 73.159656
## iter 100 value 73.140923
## final  value 73.140923 
## stopped after 100 iterations
## # weights:  17
## initial  value 207.301208 
## iter  10 value 66.373670
## iter  20 value 60.286169
## iter  30 value 59.300641
## iter  40 value 58.351574
## iter  50 value 58.190796
## iter  60 value 57.568385
## iter  70 value 57.517155
## iter  80 value 57.514182
## iter  90 value 57.513977
## iter 100 value 57.513924
## final  value 57.513924 
## stopped after 100 iterations
## # weights:  49
## initial  value 183.590527 
## iter  10 value 109.301824
## iter  20 value 106.634144
## iter  30 value 102.514844
## iter  40 value 84.170884
## iter  50 value 75.142303
## iter  60 value 67.137646
## iter  70 value 61.654159
## iter  80 value 61.371866
## iter  90 value 52.998159
## iter 100 value 51.000742
## final  value 51.000742 
## stopped after 100 iterations
## # weights:  81
## initial  value 181.488296 
## final  value 117.000000 
## converged
## # weights:  17
## initial  value 191.348153 
## iter  10 value 112.691535
## iter  20 value 75.581675
## iter  30 value 64.202085
## iter  40 value 62.824348
## final  value 62.794951 
## converged
## # weights:  49
## initial  value 175.205034 
## iter  10 value 69.942884
## iter  20 value 61.105847
## iter  30 value 58.565369
## iter  40 value 54.087853
## iter  50 value 53.866728
## iter  60 value 52.939720
## iter  70 value 51.002677
## iter  80 value 49.181145
## iter  90 value 48.834667
## iter 100 value 48.765235
## final  value 48.765235 
## stopped after 100 iterations
## # weights:  81
## initial  value 185.075591 
## iter  10 value 74.790249
## iter  20 value 55.853763
## iter  30 value 51.030267
## iter  40 value 48.633315
## iter  50 value 46.895278
## iter  60 value 46.147770
## iter  70 value 45.730047
## iter  80 value 45.521307
## iter  90 value 44.877851
## iter 100 value 44.746787
## final  value 44.746787 
## stopped after 100 iterations
## # weights:  17
## initial  value 183.320334 
## iter  10 value 88.127110
## iter  20 value 87.110717
## iter  30 value 87.107363
## iter  40 value 87.102913
## iter  50 value 87.096536
## iter  60 value 87.086281
## iter  70 value 87.066224
## iter  80 value 87.006305
## iter  90 value 85.136814
## iter 100 value 72.239246
## final  value 72.239246 
## stopped after 100 iterations
## # weights:  49
## initial  value 196.112358 
## iter  10 value 75.680236
## iter  20 value 65.048062
## iter  30 value 60.759922
## iter  40 value 53.071203
## iter  50 value 43.216300
## iter  60 value 37.680885
## iter  70 value 35.877451
## iter  80 value 35.433554
## iter  90 value 35.078061
## iter 100 value 34.945747
## final  value 34.945747 
## stopped after 100 iterations
## # weights:  81
## initial  value 154.354739 
## iter  10 value 84.469644
## iter  20 value 74.734589
## iter  30 value 66.587104
## iter  40 value 56.391673
## iter  50 value 51.401367
## iter  60 value 49.551876
## iter  70 value 46.096485
## iter  80 value 42.631508
## iter  90 value 41.225288
## iter 100 value 40.427053
## final  value 40.427053 
## stopped after 100 iterations
## ... (the verbose nnet training trace, i.e. the "# weights", "initial value",
## and "iter ... value" lines, repeats in the same pattern for every resampling
## fold and every tuning combination; truncated here)
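As an aside, the verbose iteration trace above comes from nnet's optimizer. Assuming the model was fit with caret::train and method = "nnet" (the original call is not shown in this section, and train_eng_nn is an assumed name for the training data), the trace can be silenced by forwarding trace = FALSE to nnet:

```r
# Hypothetical sketch: the original train() call is not shown here,
# and "train_eng_nn" is an assumed name for the training data.
library(caret)
modell_nn6 <- train(target ~ ., data = train_eng_nn,
                    method = "nnet",
                    trace  = FALSE,  # forwarded to nnet(); suppresses the iter/value log
                    maxit  = 100)
```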
plot(modell_nn6)

Notable for this model and these training data is that the best model in each case uses 5 hidden units: for all three weight-decay values tried, 5 hidden units reach a training accuracy of just under 92%. The overall best model has a weight decay of 0.1.

modell_nn6_best <- modell_nn6$bestTune
modell_nn6_best
##   size decay
## 9    5   0.1
predict_testNN_6 <- predict(modell_nn6, test_eng_nn)
predict_testNN_6 <- sapply(predict_testNN_6, round, digits = 0)
nn_table6 <- table(test_eng_nn$target, predict_testNN_6)

This model reaches an accuracy of just over 90% and a specificity of about 53%. A strength of this model is that only 2 patients are wrongly reported as healthy even though they are infected.

results_nn6 <- data.frame(actual = test_eng_nn$target, prediction = predict_testNN_6)
conf_nn6 <- confusionMatrix(table(results_nn6$actual,results_nn6$prediction))
conf_nn6
## Confusion Matrix and Statistics
## 
##    
##      0  1
##   0 86  8
##   1  2  9
##                                           
##                Accuracy : 0.9048          
##                  95% CI : (0.8318, 0.9534)
##     No Information Rate : 0.8381          
##     P-Value [Acc > NIR] : 0.0362          
##                                           
##                   Kappa : 0.5908          
##                                           
##  Mcnemar's Test P-Value : 0.1138          
##                                           
##             Sensitivity : 0.9773          
##             Specificity : 0.5294          
##          Pos Pred Value : 0.9149          
##          Neg Pred Value : 0.8182          
##              Prevalence : 0.8381          
##          Detection Rate : 0.8190          
##    Detection Prevalence : 0.8952          
##       Balanced Accuracy : 0.7533          
##                                           
##        'Positive' Class : 0               
## 
acc_nn6 <- conf_nn6$overall[1]
sens_nn6 <- conf_nn6$byClass[1]
spec_nn6 <- conf_nn6$byClass[2]
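The values reported by confusionMatrix() can be checked by hand from the printed 2x2 table (with class 0 as the 'positive' class):

```r
# Re-derive the headline metrics from the printed confusion table.
tab <- matrix(c(86, 2, 8, 9), nrow = 2)    # column-major: first column is 86, 2
accuracy    <- sum(diag(tab)) / sum(tab)   # (86 + 9) / 105 = 0.9048
sensitivity <- tab[1, 1] / sum(tab[, 1])   # 86 / (86 + 2) = 0.9773
specificity <- tab[2, 2] / sum(tab[, 2])   # 9 / (8 + 9)   = 0.5294
```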

Explainer for the neural net model:

To understand the neural network models a little better, we now use the DALEX library to explain the NN model that performed best on the test data.

library(DALEX)
## Warning: package 'DALEX' was built under R version 3.6.2
## Welcome to DALEX (version: 1.2.1).
## Find examples and detailed introduction at: https://pbiecek.github.io/ema/
## 
## Attaching package: 'DALEX'
## The following object is masked from 'package:dplyr':
## 
##     explain
# create the explainer; p_fun returns the predicted probability of the second class
p_fun <- function(object, newdata) { predict(object, newdata = newdata, type = "prob")[, 2] }

explainer_nn <- explain(modell_nn4, label = "nn", 
                                    data = data_test, y = as.numeric(data_test$target),
                                    colorize = FALSE, predict_function = p_fun) 
## Preparation of a new explainer is initiated
##   -> model label       :  nn 
##   -> data              :  105  rows  17  cols 
##   -> target variable   :  105  values 
##   -> model_info        :  package caret , ver. 6.0.85 , task Classification (  default  ) 
##   -> predict function  :  p_fun 
##   -> predicted values  :  numerical, min =  5.201416e-05 , mean =  0.1150474 , max =  0.949322  
##   -> residual function :  difference between y and yhat (  default  )
##   -> residuals         :  numerical, min =  0.05067799 , mean =  0.9897145 , max =  1.965565  
##   A new explainer has been created!
model_perf_nn <- model_performance(explainer_nn)
plot(model_perf_nn)

plot(model_perf_nn, geom = "boxplot")
## Warning: Ignoring unknown parameters: fun
## No summary function supplied, defaulting to `mean_se()

vi_classif_nn <- variable_importance(explainer_nn, loss_function = loss_root_mean_square)

The variable importance plot shows that the factor sickness has by far the largest influence on the model's predictions. Some blood values (Leukocytes, Hematocrit), on the other hand, have almost no influence on the output.

plot(vi_classif_nn)

We now create two partial dependence plots: one for an “unimportant” variable, in our case Leukocytes, and one for the variable with the largest influence, the factor indicating whether someone has a pre-existing condition or not.

pdp_classif_nn_leuko <- variable_profile(explainer_nn, variable =  "Leukocytes", type = "partial" )
pdp_classif_nn_sick <- variable_profile(explainer_nn, variable =  "sickness", type = "partial" )
## 'variable_type' changed to 'categorical' due to lack of numerical variables.
plot(pdp_classif_nn_leuko)

The higher the leukocyte level in the blood, the more likely our model is to classify someone as healthy. In our model this relationship is almost linear.

plot(pdp_classif_nn_sick)

For the sickness variable, someone without a pre-existing condition is more likely to be classified as Corona-positive by our model than someone with a pre-existing condition.
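What variable_profile(type = "partial") computes for a factor can be reproduced by hand: fix the factor at each level for every observation and average the model's predictions. A minimal base-R sketch on simulated data (the toy data frame and the logistic regression here are illustrative stand-ins, not our actual NN model):

```r
# Hand-rolled partial dependence for a binary factor, shown on simulated
# data with a glm (illustrative; not the fitted neural network).
set.seed(1)
n   <- 500
sim <- data.frame(
  sickness = factor(sample(0:1, n, replace = TRUE)),
  leuko    = rnorm(n)
)
# Simulate a target where a pre-existing condition lowers P(positive).
p          <- plogis(-1 - 1.5 * (sim$sickness == "1") - 0.8 * sim$leuko)
sim$target <- rbinom(n, 1, p)

fit <- glm(target ~ sickness + leuko, data = sim, family = binomial)

# Partial dependence: set sickness to each level, average the predictions.
pd <- sapply(levels(sim$sickness), function(lv) {
  tmp <- sim
  tmp$sickness <- factor(lv, levels = levels(sim$sickness))
  mean(predict(fit, newdata = tmp, type = "response"))
})
pd  # average predicted probability per sickness level
```

With this simulation the level-0 average sits above the level-1 average, mirroring the pattern the PDP above shows for our model.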

ale_classif_nn_leuko <- variable_profile(explainer_nn, variable =  "Leukocytes", type = "accumulated")
ale_classif_nn_sick <- variable_profile(explainer_nn, variable =  "sickness", type = "accumulated")
## 'variable_type' changed to 'categorical' due to lack of numerical variables.
plot(ale_classif_nn_leuko)

plot(ale_classif_nn_sick)

Naive Bayes Classifier

# Naive Bayes Classifier

set.seed(7267166)
trainIndex=createDataPartition(data_clean$target, p=0.7)$Resample1
train=data_clean[trainIndex, ]
test=data_clean[-trainIndex, ]

## check the balance
print(table(data_clean$target))
## 
##   0   1 
## 474  58
# Naive Bayes Classifier

library(e1071)

NBclassfier_clean=naiveBayes(target~., data=train)
print(NBclassfier_clean)
## 
## Naive Bayes Classifier for Discrete Predictors
## 
## Call:
## naiveBayes.default(x = X, y = Y, laplace = laplace)
## 
## A-priori probabilities:
## Y
##         0         1 
## 0.8900804 0.1099196 
## 
## Conditional probabilities:
##    Patient.age.quantile
## Y        [,1]     [,2]
##   0  9.259036 6.309663
##   1 14.024390 4.557893
## 
##    Patient.addmited.to.regular.ward..1.yes..0.no.
## Y            0          1
##   0 0.95180723 0.04819277
##   1 0.58536585 0.41463415
## 
##    Patient.addmited.to.semi.intensive.unit..1.yes..0.no.
## Y           0         1
##   0 0.9246988 0.0753012
##   1 0.8292683 0.1707317
## 
##    Patient.addmited.to.intensive.care.unit..1.yes..0.no.
## Y            0          1
##   0 0.95180723 0.04819277
##   1 0.90243902 0.09756098
## 
##    sickness
## Y            0          1
##   0 0.41867470 0.58132530
##   1 0.97560976 0.02439024
## 
##    Hematocrit
## Y         [,1]      [,2]
##   0 -0.1075208 0.8217811
##   1  0.2888207 0.7556972
## 
##    Platelets
## Y          [,1]      [,2]
##   0  0.03512984 0.7547930
##   1 -0.73513596 0.6355735
## 
##    Mean.platelet.volume
## Y          [,1]      [,2]
##   0 -0.02173547 0.7889366
##   1  0.13781037 0.7297791
## 
##    Lymphocytes
## Y          [,1]      [,2]
##   0 -0.05718726 0.8662317
##   1 -0.17805014 0.6997478
## 
##    Mean.corpuscular.hemoglobin.concentration..MCHC.
## Y          [,1]      [,2]
##   0 -0.05445320 0.8801287
##   1  0.09684454 0.8125258
## 
##    Leukocytes
## Y         [,1]      [,2]
##   0  0.1331424 0.8058705
##   1 -0.5721571 0.8686501
## 
##    Basophils
## Y            [,1]      [,2]
##   0 -0.0009014746 0.9948337
##   1 -0.1522362580 0.6939421
## 
##    Mean.corpuscular.hemoglobin..MCH.
## Y          [,1]      [,2]
##   0 -0.03964834 0.8006152
##   1 -0.13236144 0.9709085
## 
##    Eosinophils
## Y          [,1]      [,2]
##   0  0.05686628 0.9288753
##   1 -0.54435114 0.3297303
## 
##    Monocytes
## Y            [,1]      [,2]
##   0 -0.0002117383 0.8333964
##   1  0.3914686489 0.9893394
## 
##    Red.blood.cell.distribution.width..RDW.
## Y         [,1]      [,2]
##   0 0.12311207 0.9211221
##   1 0.05486193 1.0619366
printALL=function(model){
  trainPred=predict(model, newdata = train, type = "class")
  trainTable=table(train$target, trainPred)
  testPred=predict(model, newdata=test, type="class")
  testTable=table(test$target, testPred)
  trainAcc=(trainTable[1,1]+trainTable[2,2])/sum(trainTable)
  testAcc=(testTable[1,1]+testTable[2,2])/sum(testTable)
  message("Contingency Table for Training Data")
  print(trainTable)
  message("Contingency Table for Test Data")
  print(testTable)
  message("Accuracy")
  print(round(cbind(trainAccuracy=trainAcc, testAccuracy=testAcc),3))
}
printALL(NBclassfier_clean)
## Contingency Table for Training Data
##    trainPred
##       0   1
##   0 320  12
##   1  12  29
## Contingency Table for Test Data
##    testPred
##       0   1
##   0 137   5
##   1   8   9
## Accuracy
##      trainAccuracy testAccuracy
## [1,]         0.936        0.918

Naive Bayes on the engineered data

train <- read.csv("data/clean/train_feat_eng.csv")
test <- read.csv("data/clean/test_feat_eng.csv")
NBclassfier_eng <- naiveBayes(target~., data=train)
#printALL(NBclassfier_eng)
conf_nb <- confusionMatrix(table(test$target, predict(NBclassfier_eng, test)))  # use the model trained on the engineered data
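confusionMatrix bundles accuracy, sensitivity and specificity for us; as a cross-check, the same quantities can be computed directly from a 2x2 contingency table. A minimal base-R sketch, using the counts from the Naive Bayes test table above (positive class "0", as throughout this report):

```r
# Metrics from a 2x2 table (rows = actual, cols = predicted).
# Counts taken from the Naive Bayes test contingency table above.
tab <- matrix(c(137, 5,
                  8, 9),
              nrow = 2, byrow = TRUE,
              dimnames = list(actual = c("0", "1"), predicted = c("0", "1")))

accuracy    <- sum(diag(tab)) / sum(tab)        # (TP + TN) / N
sensitivity <- tab["0", "0"] / sum(tab["0", ])  # TP / (TP + FN), positive = "0"
specificity <- tab["1", "1"] / sum(tab["1", ])  # TN / (TN + FP)

round(c(accuracy = accuracy, sensitivity = sensitivity, specificity = specificity), 3)
# accuracy 0.918 -- matches the testAccuracy printed above
```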

Tree based Models

library(rpart)
library(rpart.plot)
library(randomForest)
## randomForest 4.6-14
## Type rfNews() to see new features/changes/bug fixes.
## 
## Attaching package: 'randomForest'
## The following object is masked from 'package:ggplot2':
## 
##     margin
## The following object is masked from 'package:dplyr':
## 
##     combine
data_clean <- read.csv("data/clean/data_clean.csv")
set.seed(3456)
train_idx <- createDataPartition(data_clean$target, p = .8, 
                                  list = FALSE, 
                                  times = 1)

data_train <- data_clean[train_idx, ]
data_test <- data_clean[-train_idx, ]

train_feat_eng <- read.csv("data/clean/train_feat_eng.csv")
test_feat_eng <- read.csv("data/clean/test_feat_eng.csv")
#As a first attempt we use the rpart package and the original features of our data set.
set.seed(200989)
trees1_fit <- rpart(target ~., data = data_train, method = "class")

#Plot of the first fit
rpart.plot(trees1_fit)

Here, as in other models, we see that the leukocytes are particularly important for assessing or ruling out an infection.

#Find the complexity parameter with the minimal cross-validation error
min_cp <- trees1_fit$cptable[which.min(trees1_fit$cptable[,"xerror"]),"CP"]
min_cp
## [1] 0.01
#Pruning the tree
trees1_prune <- prune(trees1_fit, cp = min_cp)
trees1_pruned_test_prediction <- predict(trees1_prune, newdata = data_test, type = "class")

cf1 <- confusionMatrix(trees1_pruned_test_prediction, as.factor(data_test$target))
cf1
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction   0   1
##          0 101   1
##          1   2   2
##                                           
##                Accuracy : 0.9717          
##                  95% CI : (0.9195, 0.9941)
##     No Information Rate : 0.9717          
##     P-Value [Acc > NIR] : 0.6472          
##                                           
##                   Kappa : 0.5571          
##                                           
##  Mcnemar's Test P-Value : 1.0000          
##                                           
##             Sensitivity : 0.9806          
##             Specificity : 0.6667          
##          Pos Pred Value : 0.9902          
##          Neg Pred Value : 0.5000          
##              Prevalence : 0.9717          
##          Detection Rate : 0.9528          
##    Detection Prevalence : 0.9623          
##       Balanced Accuracy : 0.8236          
##                                           
##        'Positive' Class : 0               
## 
tree1_acc <- cf1[["overall"]][["Accuracy"]]
tree1_spec <- cf1[["byClass"]][["Specificity"]]
tree1_sens <- cf1[["byClass"]][["Sensitivity"]]
tree1_prec <- cf1[["byClass"]][["Precision"]]

Applying our first tree model already gives very good results: 97% accuracy, 98% sensitivity and a specificity of about 67%. In the next step we try to improve on this using the features we engineered.

#New attempt with the engineered features
trees2_fit <- rpart(target ~., data = train_feat_eng, method = "class")
trees2_prediction <- predict(trees2_fit, newdata = test_feat_eng, type = "class")
rpart.plot(trees2_fit)

#summary(trees2_fit)
#Test confusion matrix for the tree with the adjusted features
cf2 <- confusionMatrix(trees2_prediction, as.factor(test_feat_eng$target))
cf2
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction  0  1
##          0 80  3
##          1 14  8
##                                           
##                Accuracy : 0.8381          
##                  95% CI : (0.7535, 0.9028)
##     No Information Rate : 0.8952          
##     P-Value [Acc > NIR] : 0.97525         
##                                           
##                   Kappa : 0.4012          
##                                           
##  Mcnemar's Test P-Value : 0.01529         
##                                           
##             Sensitivity : 0.8511          
##             Specificity : 0.7273          
##          Pos Pred Value : 0.9639          
##          Neg Pred Value : 0.3636          
##              Prevalence : 0.8952          
##          Detection Rate : 0.7619          
##    Detection Prevalence : 0.7905          
##       Balanced Accuracy : 0.7892          
##                                           
##        'Positive' Class : 0               
## 
tree2_acc <- cf2[["overall"]][["Accuracy"]]
tree2_spec <- cf2[["byClass"]][["Specificity"]]
tree2_sens <- cf2[["byClass"]][["Sensitivity"]]
tree2_prec <- cf2[["byClass"]][["Precision"]]

We now see that specificity improves at the expense of accuracy and sensitivity; overall, however, the results are worse.

Let us now see whether we can improve on our first results with a random forest model.

#Bagging with a RF on the original features
set.seed(200989)
rf1_fit <- randomForest(as.factor(target) ~ ., data = data_train, mtry = 2, importance = TRUE, ntree = 220)  # note: the argument is ntree, not ntrees
rf1_prediction <- predict(rf1_fit, newdata = data_test, type = "class")
cf_rf1 <- confusionMatrix(rf1_prediction, as.factor(data_test$target))
cf_rf1
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction   0   1
##          0 103   1
##          1   0   2
##                                           
##                Accuracy : 0.9906          
##                  95% CI : (0.9486, 0.9998)
##     No Information Rate : 0.9717          
##     P-Value [Acc > NIR] : 0.1949          
##                                           
##                   Kappa : 0.7954          
##                                           
##  Mcnemar's Test P-Value : 1.0000          
##                                           
##             Sensitivity : 1.0000          
##             Specificity : 0.6667          
##          Pos Pred Value : 0.9904          
##          Neg Pred Value : 1.0000          
##              Prevalence : 0.9717          
##          Detection Rate : 0.9717          
##    Detection Prevalence : 0.9811          
##       Balanced Accuracy : 0.8333          
##                                           
##        'Positive' Class : 0               
## 
rf1_acc <- cf_rf1[["overall"]][["Accuracy"]]
rf1_spec <- cf_rf1[["byClass"]][["Specificity"]]
rf1_sens <- cf_rf1[["byClass"]][["Sensitivity"]]
rf1_prec <- cf_rf1[["byClass"]][["Precision"]]

With an accuracy of 99% and a specificity of 67%, the random forest model gives us extremely good results. Let us now see whether the new features bring any further benefit:

#Random forest with the new features
set.seed(200989)

rf2_fit <- randomForest(as.factor(target) ~ ., data = train_feat_eng, mtry = 15, 
                        importance = TRUE, ntree = 100)  # ntree, not ntrees

rf2_prediction <- predict(rf2_fit, newdata = test_feat_eng, type = "class")

cf_rf2 <- confusionMatrix(rf2_prediction, as.factor(test_feat_eng$target))
cf_rf2
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction  0  1
##          0 92  3
##          1  2  8
##                                           
##                Accuracy : 0.9524          
##                  95% CI : (0.8924, 0.9844)
##     No Information Rate : 0.8952          
##     P-Value [Acc > NIR] : 0.03059         
##                                           
##                   Kappa : 0.7355          
##                                           
##  Mcnemar's Test P-Value : 1.00000         
##                                           
##             Sensitivity : 0.9787          
##             Specificity : 0.7273          
##          Pos Pred Value : 0.9684          
##          Neg Pred Value : 0.8000          
##              Prevalence : 0.8952          
##          Detection Rate : 0.8762          
##    Detection Prevalence : 0.9048          
##       Balanced Accuracy : 0.8530          
##                                           
##        'Positive' Class : 0               
## 
rf2_acc <- cf_rf2[["overall"]][["Accuracy"]]
rf2_spec <- cf_rf2[["byClass"]][["Specificity"]]
rf2_sens <- cf_rf2[["byClass"]][["Sensitivity"]]
rf2_prec <- cf_rf2[["byClass"]][["Precision"]]

As with the tree model, the results deteriorate here with the new features.

Finally, we try to find the best tune via a random search. The search setup uses 15-fold CV with 3 repeats.
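Repeated k-fold CV, as configured via trainControl below, simply reshuffles the fold assignment on each repeat and averages the resampling results. A base-R sketch of what 15-fold, 3-repeat index generation amounts to (unstratified; caret additionally balances the class proportions across folds):

```r
# Plain (unstratified) repeated k-fold assignment: each repeat is an
# independent random partition of the rows into k folds.
make_repeated_folds <- function(n, k = 15, repeats = 3) {
  lapply(seq_len(repeats), function(r) {
    sample(rep(seq_len(k), length.out = n))  # fold id for each row
  })
}

set.seed(200989)
folds <- make_repeated_folds(n = 426, k = 15, repeats = 3)
table(folds[[1]])  # fold sizes within one repeat, all close to 426/15
```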

# Random search for the right setup
control_trees <- trainControl(method="repeatedcv", number=15, repeats=3, search="random")
set.seed(200989)
mtry_trees <- sqrt(16)  # default mtry heuristic; unused below, tuneLength drives the random search
rf_random <- train(as.factor(target)~., data=data_train, method="rf", metric="Accuracy", tuneLength=15, trControl=control_trees)

print(rf_random)
## Random Forest 
## 
## 426 samples
##  16 predictor
##   2 classes: '0', '1' 
## 
## No pre-processing
## Resampling: Cross-Validated (15 fold, repeated 3 times) 
## Summary of sample sizes: 397, 399, 397, 399, 398, 398, ... 
## Resampling results across tuning parameters:
## 
##   mtry  Accuracy   Kappa    
##    1    0.8868434  0.1832382
##    3    0.9257658  0.5690219
##    5    0.9305044  0.6111560
##    6    0.9305317  0.6127604
##    8    0.9320917  0.6240885
##    9    0.9335969  0.6269844
##   10    0.9343632  0.6337862
##   11    0.9328032  0.6285357
##   12    0.9328032  0.6269938
##   13    0.9311571  0.6205962
##   15    0.9295131  0.6185752
## 
## Accuracy was used to select the optimal model using the largest value.
## The final value used for the model was mtry = 10.
plot(rf_random)

How does this model with mtry = 10 now fare when predicting the test data?

rf_random_prediction <- predict(rf_random, newdata = data_test)
cf_rf_rand <- confusionMatrix(as.factor(rf_random_prediction), as.factor(data_test$target))
cf_rf_rand
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction   0   1
##          0 103   1
##          1   0   2
##                                           
##                Accuracy : 0.9906          
##                  95% CI : (0.9486, 0.9998)
##     No Information Rate : 0.9717          
##     P-Value [Acc > NIR] : 0.1949          
##                                           
##                   Kappa : 0.7954          
##                                           
##  Mcnemar's Test P-Value : 1.0000          
##                                           
##             Sensitivity : 1.0000          
##             Specificity : 0.6667          
##          Pos Pred Value : 0.9904          
##          Neg Pred Value : 1.0000          
##              Prevalence : 0.9717          
##          Detection Rate : 0.9717          
##    Detection Prevalence : 0.9811          
##       Balanced Accuracy : 0.8333          
##                                           
##        'Positive' Class : 0               
## 
rf_rand_acc <- cf_rf_rand[["overall"]][["Accuracy"]]
rf_rand_spec <- cf_rf_rand[["byClass"]][["Specificity"]]
rf_rand_sens <- cf_rf_rand[["byClass"]][["Sensitivity"]]
rf_rand_prec <- cf_rf_rand[["byClass"]][["Precision"]]

This setup, too, delivers the same results as the original random forest on the original features.

To wrap up, let us look at the overview of the results once more:

library(kableExtra)
modell_trees <- c("Tree 1", "Tree 2", "RF 1", "RF 2", "RF rand")
tree_test_acc <- c(tree1_acc, tree2_acc, rf1_acc, rf2_acc, rf_rand_acc)
tree_sens <- c(tree1_sens, tree2_sens, rf1_sens, rf2_sens, rf_rand_sens)
tree_spec <- c(tree1_spec, tree2_spec, rf1_spec, rf2_spec, rf_rand_spec)
results_trees = data.frame(
  "model" = modell_trees,
  "sensitivity" = tree_sens,
  "Specificity" = tree_spec,
  "Accuracy" = tree_test_acc
)

kable_styling(kable(results_trees, format = "html", digits = 4), full_width = FALSE)
model     sensitivity  Specificity  Accuracy
Tree 1         0.9806       0.6667    0.9717
Tree 2         0.8511       0.7273    0.8381
RF 1           1.0000       0.6667    0.9906
RF 2           0.9787       0.7273    0.9524
RF rand        1.0000       0.6667    0.9906

We have seen that the RF models in particular deliver very good results on our test data. As already described, however, we face a “rare cases” problem here, and it remains to be seen how the models would perform on differently balanced data sets.
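One straightforward way to probe the rare-cases concern would be to rebalance the training data before refitting, e.g. by downsampling the majority class. A hedged base-R sketch (the toy data frame mimics our 474/58 class split; this is an illustration, not a step we ran):

```r
# Downsample the majority class so both classes are equally frequent.
downsample <- function(df, target_col = "target") {
  counts <- table(df[[target_col]])
  n_min  <- min(counts)
  idx <- unlist(lapply(names(counts), function(cl) {
    rows <- which(df[[target_col]] == cl)
    sample(rows, n_min)        # keep n_min rows of each class
  }))
  df[sample(idx), ]            # shuffle the combined rows
}

set.seed(42)
toy <- data.frame(target = rep(c(0, 1), c(474, 58)), x = rnorm(532))
balanced <- downsample(toy)
table(balanced$target)         # 58 of each class
```

Refitting the models on such a balanced training set (while evaluating on the untouched test split) would show how much of the high accuracy is driven by the class imbalance.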

Best models:


modell <- c("SVM Radial Sigma Kernel", "Neural Network", "Naive Bayes Classifier", "Random Forest")
accuracies <- c(CM_RS$overall[1], acc_nn4, conf_nb$overall[1], rf1_acc)
sensitivities <- c(CM_RS$byClass[1],sens_nn4, conf_nb$byClass[1], rf1_sens)
specificities <- c(CM_RS$byClass[2],spec_nn4, conf_nb$byClass[2], rf1_spec)
results_overall = data.frame(
  "model" = modell,
  "sensitivity" = sensitivities,
  "Specificity" = specificities,
  "Test Accuracy" = accuracies
)

kable_styling(kable(results_overall, format = "html", digits = 4), full_width = FALSE)
model                    sensitivity  Specificity  Test.Accuracy
SVM Radial Sigma Kernel       0.9789       0.9000         0.9714
Neural Network                0.9474       0.6000         0.9143
Naive Bayes Classifier        0.8842       0.0000         0.8000
Random Forest                 1.0000       0.6667         0.9906

The overview table shows clearly that both the best SVM model and the best tree model reach a very high accuracy of over 97%. The neural network also still reaches a good accuracy of over 91%. The difference between the models becomes very apparent, however, when looking at specificity: at 90%, the SVM with radial kernel and tuned sigma parameter is clearly better than all other models. This model would be suitable for classifying Corona-infected versus non-infected patients. The neural network and the random forest, with specificities of just over 60%, fare worse here, and because these are highly sensitive health data we would therefore not use them “in production”. The Naive Bayes classifier performs only slightly better than the baseline model from the proposal, the logistic regression.
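The sensitivity/specificity trade-off also depends on the decision threshold, not only on the model: shifting the probability cutoff trades one metric against the other. A minimal sketch on simulated data with a logistic regression (illustrative only, not one of the models above; here class 1 is coded as positive):

```r
# Sensitivity and specificity as a function of the classification threshold.
set.seed(7)
n    <- 1000
x    <- rnorm(n)
y    <- rbinom(n, 1, plogis(2 * x))   # 1 = positive class in this sketch
prob <- predict(glm(y ~ x, family = binomial), type = "response")

metrics_at <- function(th) {
  pred <- as.integer(prob >= th)
  c(threshold   = th,
    sensitivity = mean(pred[y == 1] == 1),
    specificity = mean(pred[y == 0] == 0))
}
round(t(sapply(c(0.25, 0.5, 0.75), metrics_at)), 3)
```

Raising the threshold increases specificity at the cost of sensitivity; tuning it on a validation set would be one cheap lever for the low-specificity models above.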

Conclusion:

As expected after the exploratory data analysis, it is difficult for the various models to achieve both high specificity and high sensitivity. Our models consistently achieve high sensitivity, while only one model reached a specificity above 70%. Accuracy is mostly at a high level, and the differences between the individual models are fairly small. The high sensitivity allows a fairly precise exclusion of non-infected patients, which is an important factor in testing in order to better identify and treat those who are truly infected. Our next step would be to implement a further test, based on somewhat different factors, to better separate the false positives from the true positives after our current classification. All in all, we can say that we found a model that would perform clearly better than 50:50 at classifying Corona patients, and that is already worth a lot. Nevertheless, for such a sensitive classification task we would definitely still recommend a downstream confirmatory test.